Feature: Follow-up Prompts (#3280)

* Add migrations - add follow-up prompts column to chat flow and chat message
* Add configuration tab for follow-up prompts
* Add follow-up prompts functionality
* Pin zod version in components - this was causing a type error with structured outputs
* Generate follow-up prompts if enabled; return them in the stream and the response, and save them to the database
* Show follow-up prompts after getting a response
* Add Google GenAI for generating follow-up prompts and fix issues
* Add config for Google GenAI and update model options
* Update follow-up prompts UI and styles
* Release/2.1.0 (#3204): flowise@2.1.0 release
* Chore/update flowise-embed version to 2.0.0 (#3205)
  * update flowise-embed version in lock file
  * add agent messages to share chatbot
  * Update pnpm-lock.yaml
  * update flowise-embed to 1.3.9, then to 2.0
* Bugfix/CodeInterpreter E2B Credential (#3206)
  * Base changes for server-side events (instead of socket.io)
  * lint fixes
  * add interface and separate methods for streaming events
  * first draft; handles both internal and external prediction endpoints
  * additional internal endpoint for streaming and associated changes
  * return streamResponse as true to build agent flow
  * JSON formatting for internal events, plus other fixes
  * convert internal event to metadata to maintain consistency with the external response
  * fix action and metadata streaming
  * fix error when agent flow is aborted
  * prevent subflows from streaming and other code cleanup
  * prevent streaming from enclosed tools
  * add fix for preventing chain tool streaming
  * update lock file
  * add open when hidden to SSE
  * streaming error fixes; add fix for showing error message
  * add code interpreter
  * add artifacts to view message dialog
  * Update pnpm-lock.yaml
  * uncomment E2B credential
  * Co-authored-by: Vinod Paidimarry <vinodkiran@outlook.in>
* Release/2.1.0 (#3207)
  * flowise@2.1.0 release
  * update flowise-components@2.1.1
* Bugfix/Add artifacts migration script to other database types (#3210)
* Release/2.1.1 (#3213)
* Bugfix/Add header to allow SSE on nginx (#3214)
* Bugfix/remove invalid markdown (#3219)
* Correct "as" casing (#3216)
  * Remove "version" line from docker-compose file
  * Update docker-compose.yml
  * Co-authored-by: Henry Heng <henryheng@flowiseai.com>
* chore: update unstructured API URL and doc reference (#3224)
* Feature/add ability to specify dynamic metadata to jsonlines (#3238)
  * fix additional metadata
* Bugfix/Buffer Memory for Anthropic (#3242): fix buffer memory
* Added env vars to UI and API URL (#3141)
  * feat: add environment vars to split the application into different deployments for better scalability
  * update: package.json - added start script for UI
  * Co-authored-by: patrick <patrick.alves@br.experian.com>
* Added 1-click deployment link for Alibaba Cloud (#3251)
  * change service name
  * Co-authored-by: yehan <gn398171@alibaba-inc.com>
* Chore/Groq Llama 3.2 (#3255)
  * add gemini flash; add gemini flash and gemini-1.5-flash-preview to vertex
  * add azure gpt-4o and gpt-4o-mini; add o1-mini
  * add claude 3.5 sonnet; add mistral nemo
  * add groq llama 3.1 and llama 3.2
* Bugfix/Prevent streaming of chatflow tool and chain tool (#3257)
* Bugfix/Enable Custom Tool Optional Input Schema (#3258)
  * enable optional input schema
* Bugfix/Searxng tool not working (#3263)
* LunaryAI automatic Thread and User tracking (#3233)
  * Lunary Thread/User tracking; clean up console logs and commented lines
* feat: enable autofocus on the `new chatflow title` field to improve usability (#3260) - the dialog has only one input and it is the primary one, so no extra click is needed to set the title
* feat: save a new Chatflow when the `ENTER` key is pressed (#3261) - avoids having to use the mouse, or to tab twice and hit enter, to save a flow
* feat: save the Chatflow title when `ENTER` is pressed, or discard changes when `ESC` is pressed (#3265)
* feat: enable autofocus on the `edit chatflow title` field to improve usability (#3264) - the canvas header has only one input and it is the primary one
* feat: add a search keyboard shortcut based on the current platform (#3267)
* feat: highlight valid/invalid connections between nodes (#3266) - change the input background to green/red to hint at compatible connections, in addition to the `not-allowed` mouse cursor for incompatible ones
* Bugfix/add fixes for search of view header (#3271)
* fix: warning when passing a boolean to the border property of a Card (#3275) - MainCard wrappers like NodeCardWrapper and CardWrapper add a solid 1px border by default, but when the `MainCard.border` prop was `false` the border prop was wrongly set to a boolean instead of a string
* feat: add a shortcut text hint to the search field (#3269)
  * fix: search box width to fit the shortcut hint text
  * fix: error when not running on Mac due to an undefined `os` variable
* fix: warning when a non-boolean value was used to set the `checked` prop of a SwitchInput component (#3276) - the useEffect hook used the plain value without validation, unlike useState
* Bugfix/Throw error to prevent SSE from retrying (#3281)
* Pin zod version in components - this was causing a type error with structured outputs
* Fix conflicts in pnpm lock
* Fix UI changes for follow-up prompts
* Fix button disabled state in follow-up prompts configuration
* Fix follow-up prompts not showing up for agent flows
* Show follow-up prompts if the last message is an apiMessage and follow-up prompts are available

Co-authored-by: Henry Heng <henryheng@flowiseai.com>
Co-authored-by: Vinod Paidimarry <vinodkiran@outlook.in>
Co-authored-by: Cross <github@dillfrescott.com>
Co-authored-by: cragwolfe <cragcw@gmail.com>
Co-authored-by: patrickreinan <patrickreinan@gmail.com>
Co-authored-by: patrick <patrick.alves@br.experian.com>
Co-authored-by: yehan <34835250+yehanyh@users.noreply.github.com>
Co-authored-by: yehan <gn398171@alibaba-inc.com>
Co-authored-by: Vincelwt <vincelwt@users.noreply.github.com>
Co-authored-by: Humberto Rodríguez A. <rhumbertgz@users.noreply.github.com>
Co-authored-by: Henry <hzj94@hotmail.com>
This commit is contained in:
parent 490855729b
commit c9d8b8716b
@@ -119,7 +119,7 @@
         "weaviate-ts-client": "^1.1.0",
         "winston": "^3.9.0",
         "ws": "^8.9.0",
-        "zod": "^3.22.4",
+        "zod": "3.22.4",
         "zod-to-json-schema": "^3.21.4"
     },
     "devDependencies": {
@@ -419,3 +419,25 @@ export interface IServerSideEventStreamer {
     streamAbortEvent(chatId: string): void
     streamEndEvent(chatId: string): void
 }
+
+export enum FollowUpPromptProvider {
+    ANTHROPIC = 'chatAnthropic',
+    AZURE_OPENAI = 'azureChatOpenAI',
+    GOOGLE_GENAI = 'chatGoogleGenerativeAI',
+    MISTRALAI = 'chatMistralAI',
+    OPENAI = 'chatOpenAI'
+}
+
+export type FollowUpPromptProviderConfig = {
+    [key in FollowUpPromptProvider]: {
+        credentialId: string
+        modelName: string
+        prompt: string
+        temperature: string
+    }
+}
+
+export type FollowUpPromptConfig = {
+    status: boolean
+    selectedProvider: FollowUpPromptProvider
+} & FollowUpPromptProviderConfig
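The new types are easiest to read with a concrete value next to them. Below is a sketch of what a stored follow-up prompt configuration might look like; the field values are invented for illustration, and the mapped type is relaxed to optional keys here so a single-provider object type-checks:

```typescript
// Sketch of the FollowUpPromptConfig shape from Interface.ts (values illustrative).
enum FollowUpPromptProvider {
    ANTHROPIC = 'chatAnthropic',
    AZURE_OPENAI = 'azureChatOpenAI',
    GOOGLE_GENAI = 'chatGoogleGenerativeAI',
    MISTRALAI = 'chatMistralAI',
    OPENAI = 'chatOpenAI'
}

type ProviderSettings = {
    credentialId: string
    modelName: string
    prompt: string
    temperature: string // stored as a string; parsed with parseFloat at call time
}

// Relaxed to optional keys so the example only needs one provider entry.
type FollowUpPromptProviderConfig = { [key in FollowUpPromptProvider]?: ProviderSettings }

type FollowUpPromptConfig = {
    status: boolean
    selectedProvider: FollowUpPromptProvider
} & FollowUpPromptProviderConfig

const config: FollowUpPromptConfig = {
    status: true,
    selectedProvider: FollowUpPromptProvider.OPENAI,
    [FollowUpPromptProvider.OPENAI]: {
        credentialId: 'cred-123', // hypothetical credential id
        modelName: 'gpt-4o-mini',
        prompt: 'Given the following conversations: {history}. Suggest three short follow-up questions.',
        temperature: '0.9'
    }
}

// The selected provider's enum value doubles as the lookup key for its settings:
const providerConfig = config[config.selectedProvider]
console.log(providerConfig?.modelName) // 'gpt-4o-mini'
```

This is why `generateFollowUpPrompts` can index the config with `followUpPromptsConfig[followUpPromptsConfig.selectedProvider]` directly.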
@@ -0,0 +1,113 @@
import { FollowUpPromptConfig, FollowUpPromptProvider, ICommonObject } from './Interface'
import { getCredentialData } from './utils'
import { ChatAnthropic } from '@langchain/anthropic'
import { ChatGoogleGenerativeAI } from '@langchain/google-genai'
import { ChatMistralAI } from '@langchain/mistralai'
import { ChatOpenAI } from '@langchain/openai'
import { z } from 'zod'
import { PromptTemplate } from '@langchain/core/prompts'
import { StructuredOutputParser } from '@langchain/core/output_parsers'

const FollowUpPromptType = z
    .object({
        questions: z.array(z.string())
    })
    .describe('Generate Follow Up Prompts')

export const generateFollowUpPrompts = async (
    followUpPromptsConfig: FollowUpPromptConfig,
    apiMessageContent: string,
    options: ICommonObject
) => {
    if (followUpPromptsConfig) {
        const providerConfig = followUpPromptsConfig[followUpPromptsConfig.selectedProvider]
        const credentialId = providerConfig.credentialId as string
        const credentialData = await getCredentialData(credentialId ?? '', options)
        const followUpPromptsPrompt = providerConfig.prompt.replace('{history}', apiMessageContent)

        switch (followUpPromptsConfig.selectedProvider) {
            case FollowUpPromptProvider.ANTHROPIC: {
                const llm = new ChatAnthropic({
                    apiKey: credentialData.anthropicApiKey,
                    model: providerConfig.modelName,
                    temperature: parseFloat(`${providerConfig.temperature}`)
                })
                const structuredLLM = llm.withStructuredOutput(FollowUpPromptType)
                const structuredResponse = await structuredLLM.invoke(followUpPromptsPrompt)
                return structuredResponse
            }
            case FollowUpPromptProvider.AZURE_OPENAI: {
                const azureOpenAIApiKey = credentialData['azureOpenAIApiKey']
                const azureOpenAIApiInstanceName = credentialData['azureOpenAIApiInstanceName']
                const azureOpenAIApiDeploymentName = credentialData['azureOpenAIApiDeploymentName']
                const azureOpenAIApiVersion = credentialData['azureOpenAIApiVersion']

                const llm = new ChatOpenAI({
                    azureOpenAIApiKey,
                    azureOpenAIApiInstanceName,
                    azureOpenAIApiDeploymentName,
                    azureOpenAIApiVersion,
                    model: providerConfig.modelName,
                    temperature: parseFloat(`${providerConfig.temperature}`)
                })
                // use structured output parser because withStructuredOutput is not working
                const parser = StructuredOutputParser.fromZodSchema(FollowUpPromptType)
                const formatInstructions = parser.getFormatInstructions()
                const prompt = PromptTemplate.fromTemplate(`
                    ${providerConfig.prompt}

                    {format_instructions}
                `)
                const chain = prompt.pipe(llm).pipe(parser)
                const structuredResponse = await chain.invoke({
                    history: apiMessageContent,
                    format_instructions: formatInstructions
                })
                return structuredResponse
            }
            case FollowUpPromptProvider.GOOGLE_GENAI: {
                const llm = new ChatGoogleGenerativeAI({
                    apiKey: credentialData.googleGenerativeAPIKey,
                    model: providerConfig.modelName,
                    temperature: parseFloat(`${providerConfig.temperature}`)
                })
                // use structured output parser because withStructuredOutput is not working
                const parser = StructuredOutputParser.fromZodSchema(FollowUpPromptType)
                const formatInstructions = parser.getFormatInstructions()
                const prompt = PromptTemplate.fromTemplate(`
                    ${providerConfig.prompt}

                    {format_instructions}
                `)
                const chain = prompt.pipe(llm).pipe(parser)
                const structuredResponse = await chain.invoke({
                    history: apiMessageContent,
                    format_instructions: formatInstructions
                })
                return structuredResponse
            }
            case FollowUpPromptProvider.MISTRALAI: {
                const model = new ChatMistralAI({
                    apiKey: credentialData.mistralAIAPIKey,
                    model: providerConfig.modelName,
                    temperature: parseFloat(`${providerConfig.temperature}`)
                })
                const structuredLLM = model.withStructuredOutput(FollowUpPromptType)
                const structuredResponse = await structuredLLM.invoke(followUpPromptsPrompt)
                return structuredResponse
            }
            case FollowUpPromptProvider.OPENAI: {
                const model = new ChatOpenAI({
                    apiKey: credentialData.openAIApiKey,
                    model: providerConfig.modelName,
                    temperature: parseFloat(`${providerConfig.temperature}`)
                })
                const structuredLLM = model.withStructuredOutput(FollowUpPromptType)
                const structuredResponse = await structuredLLM.invoke(followUpPromptsPrompt)
                return structuredResponse
            }
        }
    } else {
        return undefined
    }
}
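Two small details of `generateFollowUpPrompts` are easy to miss: the `{history}` placeholder is filled in with a plain `String.prototype.replace` (a string pattern replaces only the first occurrence), and the temperature is stored as a string and coerced with `parseFloat`. A standalone sketch with made-up inputs:

```typescript
// Mirrors the string handling in generateFollowUpPrompts (inputs are made up).
const providerPrompt =
    'Given the following conversations: {history}. Please suggest three short follow-up questions.'
const apiMessageContent = 'User asked about SSE streaming; the assistant explained the event types.'

// A string pattern (not a regex) substitutes only the first '{history}' occurrence.
const followUpPromptsPrompt = providerPrompt.replace('{history}', apiMessageContent)

// Temperature round-trips through the config as text and is coerced at call time.
const temperature = parseFloat('0.9')

console.log(followUpPromptsPrompt.includes('{history}')) // false - placeholder filled
console.log(temperature) // 0.9
```

Since the default prompt only contains the placeholder once, the first-occurrence behavior of `replace` is sufficient here.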
@@ -9,3 +9,4 @@ export * from './utils'
 export * from './speechToText'
 export * from './storageUtils'
 export * from './handler'
+export * from './followUpPrompts'
@@ -27,6 +27,7 @@ export interface IChatFlow {
     apikeyid?: string
     analytic?: string
     chatbotConfig?: string
+    followUpPrompts?: string
     apiConfig?: string
     category?: string
     type?: ChatflowType
@@ -50,6 +51,7 @@ export interface IChatMessage {
     createdDate: Date
     leadEmail?: string
     action?: string | null
+    followUpPrompts?: string
 }

 export interface IChatMessageFeedback {
@@ -34,6 +34,9 @@ export class ChatFlow implements IChatFlow {
     @Column({ nullable: true, type: 'text' })
     speechToText?: string

+    @Column({ nullable: true, type: 'text' })
+    followUpPrompts?: string
+
     @Column({ nullable: true, type: 'text' })
     category?: string
@@ -56,4 +56,7 @@ export class ChatMessage implements IChatMessage {

     @Column({ nullable: true, type: 'text' })
     leadEmail?: string
+
+    @Column({ nullable: true, type: 'text' })
+    followUpPrompts?: string
 }
@@ -0,0 +1,15 @@
import { MigrationInterface, QueryRunner } from 'typeorm'

export class AddFollowUpPrompts1726666318346 implements MigrationInterface {
    public async up(queryRunner: QueryRunner): Promise<void> {
        const columnExistsInChatflow = await queryRunner.hasColumn('chat_flow', 'followUpPrompts')
        if (!columnExistsInChatflow) await queryRunner.query(`ALTER TABLE \`chat_flow\` ADD COLUMN \`followUpPrompts\` TEXT;`)
        const columnExistsInChatMessage = await queryRunner.hasColumn('chat_message', 'followUpPrompts')
        if (!columnExistsInChatMessage) await queryRunner.query(`ALTER TABLE \`chat_message\` ADD COLUMN \`followUpPrompts\` TEXT;`)
    }

    public async down(queryRunner: QueryRunner): Promise<void> {
        await queryRunner.query(`ALTER TABLE \`chat_flow\` DROP COLUMN \`followUpPrompts\`;`)
        await queryRunner.query(`ALTER TABLE \`chat_message\` DROP COLUMN \`followUpPrompts\`;`)
    }
}
@@ -25,6 +25,7 @@ import { AddActionToChatMessage1721078251523 } from './1721078251523-AddActionToChatMessage'
 import { LongTextColumn1722301395521 } from './1722301395521-LongTextColumn'
 import { AddCustomTemplate1725629836652 } from './1725629836652-AddCustomTemplate'
 import { AddArtifactsToChatMessage1726156258465 } from './1726156258465-AddArtifactsToChatMessage'
+import { AddFollowUpPrompts1726666318346 } from './1726666318346-AddFollowUpPrompts'

 export const mariadbMigrations = [
     Init1693840429259,
@@ -53,5 +54,6 @@ export const mariadbMigrations = [
     AddActionToChatMessage1721078251523,
     LongTextColumn1722301395521,
     AddCustomTemplate1725629836652,
-    AddArtifactsToChatMessage1726156258465
+    AddArtifactsToChatMessage1726156258465,
+    AddFollowUpPrompts1726666318346
 ]
@@ -0,0 +1,15 @@
import { MigrationInterface, QueryRunner } from 'typeorm'

export class AddFollowUpPrompts1726666302024 implements MigrationInterface {
    public async up(queryRunner: QueryRunner): Promise<void> {
        const columnExistsInChatflow = await queryRunner.hasColumn('chat_flow', 'followUpPrompts')
        if (!columnExistsInChatflow) await queryRunner.query(`ALTER TABLE \`chat_flow\` ADD COLUMN \`followUpPrompts\` TEXT;`)
        const columnExistsInChatMessage = await queryRunner.hasColumn('chat_message', 'followUpPrompts')
        if (!columnExistsInChatMessage) await queryRunner.query(`ALTER TABLE \`chat_message\` ADD COLUMN \`followUpPrompts\` TEXT;`)
    }

    public async down(queryRunner: QueryRunner): Promise<void> {
        await queryRunner.query(`ALTER TABLE \`chat_flow\` DROP COLUMN \`followUpPrompts\`;`)
        await queryRunner.query(`ALTER TABLE \`chat_message\` DROP COLUMN \`followUpPrompts\`;`)
    }
}
@@ -26,6 +26,7 @@ import { AddActionToChatMessage1721078251523 } from './1721078251523-AddActionToChatMessage'
 import { LongTextColumn1722301395521 } from './1722301395521-LongTextColumn'
 import { AddCustomTemplate1725629836652 } from './1725629836652-AddCustomTemplate'
 import { AddArtifactsToChatMessage1726156258465 } from './1726156258465-AddArtifactsToChatMessage'
+import { AddFollowUpPrompts1726666302024 } from './1726666302024-AddFollowUpPrompts'

 export const mysqlMigrations = [
     Init1693840429259,
@@ -55,5 +56,6 @@ export const mysqlMigrations = [
     AddActionToChatMessage1721078251523,
     LongTextColumn1722301395521,
     AddCustomTemplate1725629836652,
-    AddArtifactsToChatMessage1726156258465
+    AddArtifactsToChatMessage1726156258465,
+    AddFollowUpPrompts1726666302024
 ]
@@ -0,0 +1,13 @@
import { MigrationInterface, QueryRunner } from 'typeorm'

export class AddFollowUpPrompts1726666309552 implements MigrationInterface {
    public async up(queryRunner: QueryRunner): Promise<void> {
        await queryRunner.query(`ALTER TABLE "chat_flow" ADD COLUMN IF NOT EXISTS "followUpPrompts" TEXT;`)
        await queryRunner.query(`ALTER TABLE "chat_message" ADD COLUMN IF NOT EXISTS "followUpPrompts" TEXT;`)
    }

    public async down(queryRunner: QueryRunner): Promise<void> {
        await queryRunner.query(`ALTER TABLE "chat_flow" DROP COLUMN "followUpPrompts";`)
        await queryRunner.query(`ALTER TABLE "chat_message" DROP COLUMN "followUpPrompts";`)
    }
}
@@ -26,6 +26,7 @@ import { AddApiKey1720230151480 } from './1720230151480-AddApiKey'
 import { AddActionToChatMessage1721078251523 } from './1721078251523-AddActionToChatMessage'
 import { AddCustomTemplate1725629836652 } from './1725629836652-AddCustomTemplate'
 import { AddArtifactsToChatMessage1726156258465 } from './1726156258465-AddArtifactsToChatMessage'
+import { AddFollowUpPrompts1726666309552 } from './1726666309552-AddFollowUpPrompts'

 export const postgresMigrations = [
     Init1693891895163,
@@ -55,5 +56,6 @@ export const postgresMigrations = [
     AddApiKey1720230151480,
     AddActionToChatMessage1721078251523,
     AddCustomTemplate1725629836652,
-    AddArtifactsToChatMessage1726156258465
+    AddArtifactsToChatMessage1726156258465,
+    AddFollowUpPrompts1726666309552
 ]
@@ -0,0 +1,13 @@
import { MigrationInterface, QueryRunner } from 'typeorm'

export class AddFollowUpPrompts1726666294213 implements MigrationInterface {
    public async up(queryRunner: QueryRunner): Promise<void> {
        await queryRunner.query(`ALTER TABLE "chat_flow" ADD COLUMN "followUpPrompts" TEXT;`)
        await queryRunner.query(`ALTER TABLE "chat_message" ADD COLUMN "followUpPrompts" TEXT;`)
    }

    public async down(queryRunner: QueryRunner): Promise<void> {
        await queryRunner.query(`ALTER TABLE "chat_flow" DROP COLUMN "followUpPrompts";`)
        await queryRunner.query(`ALTER TABLE "chat_message" DROP COLUMN "followUpPrompts";`)
    }
}
@@ -25,6 +25,7 @@ import { AddApiKey1720230151480 } from './1720230151480-AddApiKey'
 import { AddActionToChatMessage1721078251523 } from './1721078251523-AddActionToChatMessage'
 import { AddArtifactsToChatMessage1726156258465 } from './1726156258465-AddArtifactsToChatMessage'
 import { AddCustomTemplate1725629836652 } from './1725629836652-AddCustomTemplate'
+import { AddFollowUpPrompts1726666294213 } from './1726666294213-AddFollowUpPrompts'

 export const sqliteMigrations = [
     Init1693835579790,
@@ -53,5 +54,6 @@ export const sqliteMigrations = [
     AddApiKey1720230151480,
     AddActionToChatMessage1721078251523,
     AddArtifactsToChatMessage1726156258465,
-    AddCustomTemplate1725629836652
+    AddCustomTemplate1725629836652,
+    AddFollowUpPrompts1726666294213
 ]
@@ -211,6 +211,9 @@ export class SSEStreamer implements IServerSideEventStreamer {
         if (apiResponse.memoryType) {
             metadataJson['memoryType'] = apiResponse.memoryType
         }
+        if (apiResponse.followUpPrompts) {
+            metadataJson['followUpPrompts'] = JSON.parse(apiResponse.followUpPrompts)
+        }
         if (Object.keys(metadataJson).length > 0) {
             this.streamCustomEvent(chatId, 'metadata', metadataJson)
         }
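The streamer assumes `apiResponse.followUpPrompts` is a JSON string: the build step stores `JSON.stringify(questions)` on the chat message, and the metadata event carries the parsed array back out. A minimal sketch of that round-trip with sample questions:

```typescript
// followUpPrompts round-trip: saved as a JSON string, parsed back for the 'metadata' SSE event.
const questions = ['How do I enable streaming?', 'Which providers are supported?']
const stored = JSON.stringify(questions) // what gets persisted on the chat message row

const metadataJson: Record<string, unknown> = {}
if (stored) {
    // Same guard-and-parse shape as SSEStreamer above.
    metadataJson['followUpPrompts'] = JSON.parse(stored)
}

const followUps = metadataJson['followUpPrompts'] as string[]
console.log(followUps.length) // 2
```

Because the column is plain TEXT, the parse step is what turns the stored value back into an array for clients consuming the event stream.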
@@ -8,6 +8,7 @@ import {
     addArrayFilesToStorage,
     mapMimeTypeToInputField,
     mapExtToInputField,
+    generateFollowUpPrompts,
     IServerSideEventStreamer
 } from 'flowise-components'
 import { StatusCodes } from 'http-status-codes'
@@ -452,6 +453,18 @@ export const utilBuildChatflow = async (req: Request, isInternal: boolean = false
     if (result?.usedTools) apiMessage.usedTools = JSON.stringify(result.usedTools)
     if (result?.fileAnnotations) apiMessage.fileAnnotations = JSON.stringify(result.fileAnnotations)
     if (result?.artifacts) apiMessage.artifacts = JSON.stringify(result.artifacts)
+    if (chatflow.followUpPrompts) {
+        const followUpPromptsConfig = JSON.parse(chatflow.followUpPrompts)
+        const followUpPrompts = await generateFollowUpPrompts(followUpPromptsConfig, apiMessage.content, {
+            chatId,
+            chatflowid,
+            appDataSource: appServer.AppDataSource,
+            databaseEntities
+        })
+        if (followUpPrompts?.questions) {
+            apiMessage.followUpPrompts = JSON.stringify(followUpPrompts.questions)
+        }
+    }

     const chatMessage = await utilAddChatMessage(apiMessage)
@@ -470,6 +483,7 @@ export const utilBuildChatflow = async (req: Request, isInternal: boolean = false
     result.question = incomingInput.question
     result.chatId = chatId
     result.chatMessageId = chatMessage?.id
+    result.followUpPrompts = JSON.stringify(apiMessage.followUpPrompts)
     result.isStreamValid = isStreamValid

     if (sessionId) result.sessionId = sessionId
@@ -543,6 +557,18 @@ const utilBuildAgentResponse = async (
     if (usedTools?.length) apiMessage.usedTools = JSON.stringify(usedTools)
     if (agentReasoning?.length) apiMessage.agentReasoning = JSON.stringify(agentReasoning)
     if (finalAction && Object.keys(finalAction).length) apiMessage.action = JSON.stringify(finalAction)
+    if (agentflow.followUpPrompts) {
+        const followUpPromptsConfig = JSON.parse(agentflow.followUpPrompts)
+        const generatedFollowUpPrompts = await generateFollowUpPrompts(followUpPromptsConfig, apiMessage.content, {
+            chatId,
+            chatflowid: agentflow.id,
+            appDataSource: appServer.AppDataSource,
+            databaseEntities
+        })
+        if (generatedFollowUpPrompts?.questions) {
+            apiMessage.followUpPrompts = JSON.stringify(generatedFollowUpPrompts.questions)
+        }
+    }
     const chatMessage = await utilAddChatMessage(apiMessage)

     await appServer.telemetry.sendTelemetry('agentflow_prediction_sent', {
@@ -591,6 +617,7 @@ const utilBuildAgentResponse = async (
     if (memoryType) result.memoryType = memoryType
     if (agentReasoning?.length) result.agentReasoning = agentReasoning
     if (finalAction && Object.keys(finalAction).length) result.action = finalAction
+    result.followUpPrompts = JSON.stringify(apiMessage.followUpPrompts)

     return result
 }
@@ -0,0 +1 @@
<svg width="32" height="32" fill="none" xmlns="http://www.w3.org/2000/svg"><circle cx="16" cy="16" r="14" fill="#CC9B7A"/><path d="m10 21 4.5-10L19 21m-7.2-2.857h5.4M18.5 11 23 21" stroke="#1F1F1E" stroke-width="2" stroke-linecap="round" stroke-linejoin="round"/></svg>
(new image, 269 B)
@@ -0,0 +1 @@
<svg width="32" height="32" fill="none" xmlns="http://www.w3.org/2000/svg"><path d="M11.946 5H19l-7.322 22.216a1.15 1.15 0 0 1-.41.568c-.19.14-.42.216-.656.216H5.123a1.11 1.11 0 0 1-.513-.127 1.132 1.132 0 0 1-.4-.352 1.165 1.165 0 0 1-.151-1.038l6.822-20.7a1.15 1.15 0 0 1 .41-.567c.19-.14.42-.216.655-.216Z" fill="#0A5FAB"/><path d="M22.334 20H11.502c-.1 0-.2.031-.282.09a.52.52 0 0 0-.185.241.545.545 0 0 0 .125.576l6.96 6.786c.203.197.47.307.747.307H25l-2.666-8Z" fill="#0078D4"/><path d="M21.035 5.782a1.149 1.149 0 0 0-.415-.566A1.128 1.128 0 0 0 19.957 5H12c.238 0 .47.076.663.216.193.14.338.338.414.566l6.906 20.7a1.16 1.16 0 0 1-.558 1.391 1.12 1.12 0 0 1-.52.127h7.959a1.127 1.127 0 0 0 .923-.48 1.159 1.159 0 0 0 .153-1.038l-6.905-20.7Z" fill="#2C9DE3"/></svg>
(new image, 771 B)
@@ -0,0 +1 @@
<svg width="32" height="32" fill="none" xmlns="http://www.w3.org/2000/svg"><path d="M5 6H4v19.5h1m8-7.5v3h1m7-11.5V6h1m-5 7.5V10h1" stroke="#000" stroke-width="2" stroke-linecap="round" stroke-linejoin="round"/><mask id="MistralAI__a" style="mask-type:alpha" maskUnits="userSpaceOnUse" x="5" y="6" width="22" height="20"><path d="M5 6v19.5h5v-8h4V21h4v-3.5h4V25h5V6h-4.5v4H18v3.5h-4v-4h-4V6H5Z" fill="#FD7000"/></mask><g mask="url(#MistralAI__a)"><path fill="#FFCD00" d="M4 6h25v4H4z"/></g><mask id="MistralAI__b" style="mask-type:alpha" maskUnits="userSpaceOnUse" x="5" y="6" width="22" height="20"><path d="M5 6v19.5h5v-8h4V21h4v-3.5h4V25h5V6h-4.5v4H18v3.5h-4v-4h-4V6H5Z" fill="#FD7000"/></mask><g mask="url(#MistralAI__b)"><path fill="#FFA200" d="M4 10h25v4H4z"/></g><mask id="MistralAI__c" style="mask-type:alpha" maskUnits="userSpaceOnUse" x="5" y="6" width="22" height="20"><path d="M5 6v19.5h5v-8h4V21h4v-3.5h4V25h5V6h-4.5v4H18v3.5h-4v-4h-4V6H5Z" fill="#FD7000"/></mask><g mask="url(#MistralAI__c)"><path fill="#FF6E00" d="M4 14h25v4H4z"/></g><mask id="MistralAI__d" style="mask-type:alpha" maskUnits="userSpaceOnUse" x="5" y="6" width="22" height="20"><path d="M5 6v19.5h5v-8h4V21h4v-3.5h4V25h5V6h-4.5v4H18v3.5h-4v-4h-4V6H5Z" fill="#FD7000"/></mask><g mask="url(#MistralAI__d)"><path fill="#FF4A09" d="M4 18h25v4H4z"/></g><mask id="MistralAI__e" style="mask-type:alpha" maskUnits="userSpaceOnUse" x="5" y="6" width="22" height="20"><path d="M5 6v19.5h5v-8h4V21h4v-3.5h4V25h5V6h-4.5v4H18v3.5h-4v-4h-4V6H5Z" fill="#FD7000"/></mask><g mask="url(#MistralAI__e)"><path fill="#FE060F" d="M4 22h25v4H4z"/></g><path d="M21 18v7h1" stroke="#000" stroke-width="2" stroke-linecap="round" stroke-linejoin="round"/><path d="M5 6v19.5h5v-8h4V21h4v-3.5h4V25h5V6h-4.5v4H18v3.5h-4v-4h-4V6H5Z" stroke="#000" stroke-width="2" stroke-linecap="round" stroke-linejoin="round"/></svg>
(new image, 1.8 KiB)
@@ -0,0 +1,45 @@
import Box from '@mui/material/Box'
import PropTypes from 'prop-types'
import { Chip } from '@mui/material'
import './StarterPromptsCard.css'
import { useSelector } from 'react-redux'

const FollowUpPromptsCard = ({ isGrid, followUpPrompts, sx, onPromptClick }) => {
    const customization = useSelector((state) => state.customization)

    return (
        <Box
            className={'button-container'}
            sx={{ width: '100%', maxWidth: isGrid ? 'inherit' : '400px', p: 1.5, display: 'flex', gap: 1, ...sx }}
        >
            {followUpPrompts.map((fp, index) => (
                <Chip
                    label={fp}
                    className={'button'}
                    key={index}
                    onClick={(e) => onPromptClick(fp, e)}
                    sx={{
                        backgroundColor: 'transparent',
                        border: '1px solid',
                        boxShadow: '0px 2px 1px -1px rgba(0,0,0,0.2)',
                        color: '#2196f3',
                        transition: 'all 300ms cubic-bezier(0.4, 0, 0.2, 1) 0ms',
                        '&:hover': {
                            backgroundColor: customization.isDarkMode ? 'rgba(0, 0, 0, 0.12)' : 'rgba(0, 0, 0, 0.05)',
                            border: '1px solid'
                        }
                    }}
                />
            ))}
        </Box>
    )
}

FollowUpPromptsCard.propTypes = {
    isGrid: PropTypes.bool,
    followUpPrompts: PropTypes.array,
    sx: PropTypes.object,
    onPromptClick: PropTypes.func
}

export default FollowUpPromptsCard
@@ -10,6 +10,7 @@ import ChatFeedback from '@/ui-component/extended/ChatFeedback'
 import AnalyseFlow from '@/ui-component/extended/AnalyseFlow'
 import StarterPrompts from '@/ui-component/extended/StarterPrompts'
 import Leads from '@/ui-component/extended/Leads'
+import FollowUpPrompts from '@/ui-component/extended/FollowUpPrompts'

 const CHATFLOW_CONFIGURATION_TABS = [
     {
@@ -20,6 +21,10 @@ const CHATFLOW_CONFIGURATION_TABS = [
         label: 'Starter Prompts',
         id: 'conversationStarters'
     },
+    {
+        label: 'Follow-up Prompts',
+        id: 'followUpPrompts'
+    },
     {
         label: 'Speech to Text',
         id: 'speechToText'
@@ -116,6 +121,7 @@ const ChatflowConfigurationDialog = ({ show, dialogProps, onCancel }) => {
     <TabPanel key={index} value={tabValue} index={index}>
         {item.id === 'rateLimiting' && <RateLimit />}
         {item.id === 'conversationStarters' ? <StarterPrompts dialogProps={dialogProps} /> : null}
+        {item.id === 'followUpPrompts' ? <FollowUpPrompts dialogProps={dialogProps} /> : null}
         {item.id === 'speechToText' ? <SpeechToText dialogProps={dialogProps} /> : null}
         {item.id === 'chatFeedback' ? <ChatFeedback dialogProps={dialogProps} /> : null}
         {item.id === 'allowedDomains' ? <AllowedDomains dialogProps={dialogProps} /> : null}
@@ -0,0 +1,527 @@
import PropTypes from 'prop-types'
import { Box, Button, FormControl, ListItem, ListItemAvatar, ListItemText, MenuItem, Select, Typography } from '@mui/material'
import { useEffect, useState } from 'react'
import { useDispatch } from 'react-redux'

// Project Imports
import { StyledButton } from '@/ui-component/button/StyledButton'
import { SwitchInput } from '@/ui-component/switch/Switch'
import chatflowsApi from '@/api/chatflows'
import { closeSnackbar as closeSnackbarAction, enqueueSnackbar as enqueueSnackbarAction, SET_CHATFLOW } from '@/store/actions'
import useNotifier from '@/utils/useNotifier'
import anthropicIcon from '@/assets/images/anthropic.svg'
import azureOpenAiIcon from '@/assets/images/azure_openai.svg'
import mistralAiIcon from '@/assets/images/mistralai.svg'
import openAiIcon from '@/assets/images/openai.svg'
import { TooltipWithParser } from '@/ui-component/tooltip/TooltipWithParser'
import CredentialInputHandler from '@/views/canvas/CredentialInputHandler'
import { Input } from '@/ui-component/input/Input'
import { AsyncDropdown } from '@/ui-component/dropdown/AsyncDropdown'
import { Dropdown } from '@/ui-component/dropdown/Dropdown'

// Icons
import { IconX } from '@tabler/icons-react'
const promptDescription =
    'Prompt used to generate questions based on the conversation history. You can use the variable {history} to refer to the conversation history.'
const defaultPrompt =
    'Given the following conversations: {history}. Please help me predict the three most likely questions that a human would ask. Keep each question short and concise.'

// Update when adding new providers
const FollowUpPromptProviders = {
    ANTHROPIC: 'chatAnthropic',
    AZURE_OPENAI: 'azureChatOpenAI',
    GOOGLE_GENAI: 'chatGoogleGenerativeAI',
    MISTRALAI: 'chatMistralAI',
    OPENAI: 'chatOpenAI'
}
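The `{history}` placeholder in the default prompt above is substituted with the serialized conversation before the follow-up-generation model call. A minimal sketch of that substitution, assuming a simple `role: content` serialization (the helper name and message shape are illustrative, not the server-side implementation):

```javascript
// Hypothetical helper: substitute {history} into the configured prompt template.
const defaultPrompt =
    'Given the following conversations: {history}. Please help me predict the three most likely questions that a human would ask. Keep each question short and concise.'

function renderFollowUpPrompt(template, messages) {
    // Serialize the conversation as "role: content" lines (assumed format)
    const history = messages.map((m) => `${m.role}: ${m.content}`).join('\n')
    return template.replace('{history}', history)
}

const rendered = renderFollowUpPrompt(defaultPrompt, [
    { role: 'user', content: 'What is Flowise?' },
    { role: 'assistant', content: 'A low-code LLM app builder.' }
])
```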
// Inputs shared by every provider
const promptInput = {
    label: 'Prompt',
    name: 'prompt',
    type: 'string',
    rows: 4,
    description: promptDescription,
    optional: true,
    default: defaultPrompt
}
const temperatureInput = {
    label: 'Temperature',
    name: 'temperature',
    type: 'number',
    step: 0.1,
    optional: true,
    default: 0.9
}

const followUpPromptsOptions = {
    [FollowUpPromptProviders.ANTHROPIC]: {
        label: 'Anthropic Claude',
        name: FollowUpPromptProviders.ANTHROPIC,
        icon: anthropicIcon,
        inputs: [
            {
                label: 'Connect Credential',
                name: 'credential',
                type: 'credential',
                credentialNames: ['anthropicApi']
            },
            {
                label: 'Model Name',
                name: 'modelName',
                type: 'asyncOptions',
                loadMethod: 'listModels'
            },
            promptInput,
            temperatureInput
        ]
    },
    [FollowUpPromptProviders.AZURE_OPENAI]: {
        label: 'Azure ChatOpenAI',
        name: FollowUpPromptProviders.AZURE_OPENAI,
        icon: azureOpenAiIcon,
        inputs: [
            {
                label: 'Connect Credential',
                name: 'credential',
                type: 'credential',
                credentialNames: ['azureOpenAIApi']
            },
            {
                label: 'Model Name',
                name: 'modelName',
                type: 'asyncOptions',
                loadMethod: 'listModels'
            },
            promptInput,
            temperatureInput
        ]
    },
    [FollowUpPromptProviders.GOOGLE_GENAI]: {
        label: 'Google Gemini',
        name: FollowUpPromptProviders.GOOGLE_GENAI,
        // NOTE: reuses the Azure OpenAI icon; a dedicated Google icon asset would be preferable
        icon: azureOpenAiIcon,
        inputs: [
            {
                label: 'Connect Credential',
                name: 'credential',
                type: 'credential',
                credentialNames: ['googleGenerativeAI']
            },
            {
                label: 'Model Name',
                name: 'modelName',
                type: 'options',
                default: 'gemini-1.5-pro-latest',
                options: [
                    { label: 'gemini-1.5-flash-latest', name: 'gemini-1.5-flash-latest' },
                    { label: 'gemini-1.5-pro-latest', name: 'gemini-1.5-pro-latest' }
                ]
            },
            promptInput,
            temperatureInput
        ]
    },
    [FollowUpPromptProviders.MISTRALAI]: {
        label: 'Mistral AI',
        name: FollowUpPromptProviders.MISTRALAI,
        icon: mistralAiIcon,
        inputs: [
            {
                label: 'Connect Credential',
                name: 'credential',
                type: 'credential',
                credentialNames: ['mistralAIApi']
            },
            {
                label: 'Model Name',
                name: 'modelName',
                type: 'options',
                options: [
                    { label: 'mistral-large-latest', name: 'mistral-large-latest' },
                    { label: 'mistral-large-2402', name: 'mistral-large-2402' }
                ]
            },
            promptInput,
            temperatureInput
        ]
    },
    [FollowUpPromptProviders.OPENAI]: {
        label: 'OpenAI',
        name: FollowUpPromptProviders.OPENAI,
        icon: openAiIcon,
        inputs: [
            {
                label: 'Connect Credential',
                name: 'credential',
                type: 'credential',
                credentialNames: ['openAIApi']
            },
            {
                label: 'Model Name',
                name: 'modelName',
                type: 'asyncOptions',
                loadMethod: 'listModels'
            },
            promptInput,
            temperatureInput
        ]
    }
}
const FollowUpPrompts = ({ dialogProps }) => {
    const dispatch = useDispatch()

    useNotifier()

    const enqueueSnackbar = (...args) => dispatch(enqueueSnackbarAction(...args))
    const closeSnackbar = (...args) => dispatch(closeSnackbarAction(...args))

    const [followUpPromptsConfig, setFollowUpPromptsConfig] = useState({})
    const [chatbotConfig, setChatbotConfig] = useState({})
    const [selectedProvider, setSelectedProvider] = useState('none')

    const handleChange = (key, value) => {
        setFollowUpPromptsConfig({
            ...followUpPromptsConfig,
            [key]: value
        })
    }

    const handleSelectedProviderChange = (event) => {
        const provider = event.target.value
        setSelectedProvider(provider)
        handleChange('selectedProvider', provider)
    }

    const setValue = (value, providerName, inputParamName) => {
        // copy the provider object too, so React state is never mutated in place
        const newVal = {
            ...followUpPromptsConfig,
            [providerName]: { ...followUpPromptsConfig[providerName], [inputParamName]: value }
        }
        if (inputParamName === 'status' && value === true) {
            // ensure that the other providers are turned off
            Object.keys(followUpPromptsOptions).forEach((key) => {
                const provider = followUpPromptsOptions[key]
                if (provider.name !== providerName) {
                    newVal[provider.name] = { ...followUpPromptsConfig[provider.name], status: false }
                }
            })
        }
        setFollowUpPromptsConfig(newVal)
        return newVal
    }
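The exclusive-enable rule in `setValue` above (turning one provider on switches every other provider off) can be sketched as a standalone function; the provider names here are a subset used purely for illustration:

```javascript
// Standalone sketch of the exclusive-enable logic: enabling one provider
// sets status: false on every other known provider.
const providers = ['chatAnthropic', 'azureChatOpenAI', 'chatOpenAI']

function enableProvider(config, providerName) {
    // copy both the top-level object and the provider entry being changed
    const next = { ...config, [providerName]: { ...config[providerName], status: true } }
    for (const name of providers) {
        if (name !== providerName) {
            next[name] = { ...next[name], status: false }
        }
    }
    return next
}

const updated = enableProvider({ chatOpenAI: { status: true } }, 'chatAnthropic')
```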
    const onSave = async () => {
        // TODO: saving without changing the prompt will not save the prompt
        try {
            const provider = followUpPromptsConfig.selectedProvider
            const value = {
                followUpPrompts: { status: followUpPromptsConfig.status }
            }
            chatbotConfig.followUpPrompts = value.followUpPrompts

            // if the prompt or temperature is not set, fall back to the provider defaults
            if (provider && provider !== 'none') {
                const inputs = followUpPromptsOptions[provider].inputs
                if (!followUpPromptsConfig[provider].prompt) {
                    followUpPromptsConfig[provider].prompt = inputs.find((input) => input.name === 'prompt').default
                }
                if (!followUpPromptsConfig[provider].temperature) {
                    followUpPromptsConfig[provider].temperature = inputs.find((input) => input.name === 'temperature').default
                }
            }

            const saveResp = await chatflowsApi.updateChatflow(dialogProps.chatflow.id, {
                chatbotConfig: JSON.stringify(chatbotConfig),
                followUpPrompts: JSON.stringify(followUpPromptsConfig)
            })
            if (saveResp.data) {
                enqueueSnackbar({
                    message: 'Follow-up Prompts configuration saved',
                    options: {
                        key: new Date().getTime() + Math.random(),
                        variant: 'success',
                        action: (key) => (
                            <Button style={{ color: 'white' }} onClick={() => closeSnackbar(key)}>
                                <IconX />
                            </Button>
                        )
                    }
                })
                dispatch({ type: SET_CHATFLOW, chatflow: saveResp.data })
            }
        } catch (error) {
            const errorData = error.response?.data || `${error.response?.status}: ${error.response?.statusText}`
            enqueueSnackbar({
                message: `Failed to save follow-up prompts configuration: ${errorData}`,
                options: {
                    key: new Date().getTime() + Math.random(),
                    variant: 'error',
                    persist: true,
                    action: (key) => (
                        <Button style={{ color: 'white' }} onClick={() => closeSnackbar(key)}>
                            <IconX />
                        </Button>
                    )
                }
            })
        }
    }
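`onSave` serializes both fields as JSON strings, matching the text columns added by this PR's migrations. A sketch of the resulting payload shape (the credential ID and model name are placeholder values, and the field names mirror the call above):

```javascript
// Sketch of the payload sent to chatflowsApi.updateChatflow():
// both fields are JSON strings, stored as text by the server.
const followUpPromptsConfig = {
    status: true,
    selectedProvider: 'chatOpenAI',
    chatOpenAI: { credentialId: 'cred-123', modelName: 'gpt-4', prompt: '...', temperature: 0.9 }
}

const payload = {
    chatbotConfig: JSON.stringify({ followUpPrompts: { status: followUpPromptsConfig.status } }),
    followUpPrompts: JSON.stringify(followUpPromptsConfig)
}
```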
useEffect(() => {
|
||||
if (dialogProps.chatflow && dialogProps.chatflow.followUpPrompts) {
|
||||
let chatbotConfig = JSON.parse(dialogProps.chatflow.chatbotConfig)
|
||||
let followUpPromptsConfig = JSON.parse(dialogProps.chatflow.followUpPrompts)
|
||||
setChatbotConfig(chatbotConfig || {})
|
||||
if (followUpPromptsConfig) {
|
||||
setFollowUpPromptsConfig(followUpPromptsConfig)
|
||||
setSelectedProvider(followUpPromptsConfig.selectedProvider)
|
||||
}
|
||||
}
|
||||
|
||||
return () => {}
|
||||
}, [dialogProps])
|
||||
|
||||
    const checkDisabled = () => {
        if (followUpPromptsConfig && followUpPromptsConfig.status) {
            if (selectedProvider === 'none') {
                return true
            }
            const provider = followUpPromptsOptions[selectedProvider]
            for (const inputParam of provider.inputs) {
                if (!inputParam.optional) {
                    const param = inputParam.name === 'credential' ? 'credentialId' : inputParam.name
                    if (
                        !followUpPromptsConfig[selectedProvider] ||
                        !followUpPromptsConfig[selectedProvider][param] ||
                        followUpPromptsConfig[selectedProvider][param] === ''
                    ) {
                        return true
                    }
                }
            }
        }
        return false
    }
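The `checkDisabled` gate above reduces to a pure predicate: Save is blocked while the feature is on but no provider is chosen or a required input is empty. A standalone sketch (the required-parameter list is passed in explicitly here, whereas the component derives it from `inputs`):

```javascript
// Pure sketch of the Save-button gate used by checkDisabled().
function isSaveDisabled(config, selectedProvider, requiredParams) {
    // feature off: nothing to validate, Save is allowed
    if (!config || !config.status) return false
    // feature on but no provider chosen: block Save
    if (selectedProvider === 'none') return true
    const providerConfig = config[selectedProvider]
    // block Save while any required parameter is missing or empty
    return requiredParams.some((p) => !providerConfig || !providerConfig[p])
}
```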
    return (
        <>
            <Box
                sx={{
                    width: '100%',
                    display: 'flex',
                    flexDirection: 'column',
                    alignItems: 'start',
                    justifyContent: 'start',
                    gap: 3,
                    mb: 2
                }}
            >
                <SwitchInput
                    label='Enable Follow-up Prompts'
                    onChange={(value) => handleChange('status', value)}
                    value={followUpPromptsConfig.status}
                />
                {followUpPromptsConfig && followUpPromptsConfig.status && (
                    <>
                        <Typography variant='h5'>Providers</Typography>
                        <FormControl fullWidth>
                            <Select size='small' value={selectedProvider} onChange={handleSelectedProviderChange}>
                                <MenuItem value='none'>None</MenuItem>
                                {Object.values(followUpPromptsOptions).map((provider) => (
                                    <MenuItem key={provider.name} value={provider.name}>
                                        {provider.label}
                                    </MenuItem>
                                ))}
                            </Select>
                        </FormControl>
                        {selectedProvider !== 'none' && (
                            <>
                                <ListItem sx={{ p: 0 }} alignItems='center'>
                                    <ListItemAvatar>
                                        <div
                                            style={{
                                                width: 50,
                                                height: 50,
                                                borderRadius: '50%',
                                                backgroundColor: 'white'
                                            }}
                                        >
                                            <img
                                                style={{
                                                    width: '100%',
                                                    height: '100%',
                                                    padding: 10,
                                                    objectFit: 'contain'
                                                }}
                                                alt='AI'
                                                src={followUpPromptsOptions[selectedProvider].icon}
                                            />
                                        </div>
                                    </ListItemAvatar>
                                    <ListItemText
                                        primary={followUpPromptsOptions[selectedProvider].label}
                                        secondary={
                                            <a target='_blank' rel='noreferrer' href={followUpPromptsOptions[selectedProvider].url}>
                                                {followUpPromptsOptions[selectedProvider].url}
                                            </a>
                                        }
                                    />
                                </ListItem>
                                {followUpPromptsOptions[selectedProvider].inputs.map((inputParam, index) => (
                                    <Box key={index} sx={{ px: 2, width: '100%' }}>
                                        <div style={{ display: 'flex', flexDirection: 'row' }}>
                                            <Typography>
                                                {inputParam.label}
                                                {!inputParam.optional && <span style={{ color: 'red' }}> *</span>}
                                                {inputParam.description && (
                                                    <TooltipWithParser style={{ marginLeft: 10 }} title={inputParam.description} />
                                                )}
                                            </Typography>
                                        </div>
                                        {inputParam.type === 'credential' && (
                                            <CredentialInputHandler
                                                key={`${selectedProvider}-${inputParam.name}`}
                                                data={
                                                    followUpPromptsConfig[selectedProvider]?.credentialId
                                                        ? { credential: followUpPromptsConfig[selectedProvider].credentialId }
                                                        : {}
                                                }
                                                inputParam={inputParam}
                                                onSelect={(newValue) => setValue(newValue, selectedProvider, 'credentialId')}
                                            />
                                        )}
                                        {(inputParam.type === 'string' ||
                                            inputParam.type === 'password' ||
                                            inputParam.type === 'number') && (
                                            <Input
                                                key={`${selectedProvider}-${inputParam.name}`}
                                                inputParam={inputParam}
                                                onChange={(newValue) => setValue(newValue, selectedProvider, inputParam.name)}
                                                value={
                                                    followUpPromptsConfig[selectedProvider] &&
                                                    followUpPromptsConfig[selectedProvider][inputParam.name]
                                                        ? followUpPromptsConfig[selectedProvider][inputParam.name]
                                                        : inputParam.default ?? ''
                                                }
                                            />
                                        )}
                                        {inputParam.type === 'asyncOptions' && (
                                            <div style={{ display: 'flex', flexDirection: 'row' }}>
                                                <AsyncDropdown
                                                    key={`${selectedProvider}-${inputParam.name}`}
                                                    name={inputParam.name}
                                                    nodeData={{
                                                        name: followUpPromptsOptions[selectedProvider].name,
                                                        inputParams: followUpPromptsOptions[selectedProvider].inputs
                                                    }}
                                                    value={
                                                        followUpPromptsConfig[selectedProvider] &&
                                                        followUpPromptsConfig[selectedProvider][inputParam.name]
                                                            ? followUpPromptsConfig[selectedProvider][inputParam.name]
                                                            : inputParam.default ?? 'choose an option'
                                                    }
                                                    onSelect={(newValue) => setValue(newValue, selectedProvider, inputParam.name)}
                                                />
                                            </div>
                                        )}
                                        {inputParam.type === 'options' && (
                                            <Dropdown
                                                name={inputParam.name}
                                                options={inputParam.options}
                                                onSelect={(newValue) => setValue(newValue, selectedProvider, inputParam.name)}
                                                value={
                                                    followUpPromptsConfig[selectedProvider] &&
                                                    followUpPromptsConfig[selectedProvider][inputParam.name]
                                                        ? followUpPromptsConfig[selectedProvider][inputParam.name]
                                                        : inputParam.default ?? 'choose an option'
                                                }
                                            />
                                        )}
                                    </Box>
                                ))}
                            </>
                        )}
                    </>
                )}
            </Box>
            <StyledButton disabled={checkDisabled()} variant='contained' onClick={onSave}>
                Save
            </StyledButton>
        </>
    )
}

FollowUpPrompts.propTypes = {
    dialogProps: PropTypes.object
}

export default FollowUpPrompts
@@ -38,7 +38,8 @@ import {
    IconSquareFilled,
    IconDeviceSdCard,
    IconCheck,
    IconPaperclip,
    IconSparkles
} from '@tabler/icons-react'
import robotPNG from '@/assets/images/robot.png'
import userPNG from '@/assets/images/account.png'
@@ -79,6 +80,7 @@ import { enqueueSnackbar as enqueueSnackbarAction, closeSnackbar as closeSnackbarAction } from '@/store/actions'
// Utils
import { isValidURL, removeDuplicateURL, setLocalStorageChatflow, getLocalStorageChatflow } from '@/utils/genericHelper'
import useNotifier from '@/utils/useNotifier'
import FollowUpPromptsCard from '@/ui-component/cards/FollowUpPromptsCard'

const messageImageStyle = {
    width: '128px',
@@ -202,6 +204,10 @@ export const ChatMessage = ({ open, chatflowid, isAgentCanvas, isDialog, previews }) => {
    const [isLeadSaving, setIsLeadSaving] = useState(false)
    const [isLeadSaved, setIsLeadSaved] = useState(false)

    // follow-up prompts
    const [followUpPromptsStatus, setFollowUpPromptsStatus] = useState(false)
    const [followUpPrompts, setFollowUpPrompts] = useState([])

    // drag & drop and file input
    const imgUploadRef = useRef(null)
    const fileUploadRef = useRef(null)
@@ -627,6 +633,12 @@ export const ChatMessage = ({ open, chatflowid, isAgentCanvas, isDialog, previews }) => {
        handleSubmit(undefined, promptStarterInput)
    }

    const handleFollowUpPromptClick = async (promptStarterInput) => {
        setUserInput(promptStarterInput)
        setFollowUpPrompts([])
        handleSubmit(undefined, promptStarterInput)
    }

    const handleActionClick = async (elem, action) => {
        setUserInput(elem.label)
        setMessages((prevMessages) => {
@@ -664,6 +676,11 @@ export const ChatMessage = ({ open, chatflowid, isAgentCanvas, isDialog, previews }) => {
                return allMessages
            })
        }

        if (data.followUpPrompts) {
            const followUpPrompts = JSON.parse(data.followUpPrompts)
            setFollowUpPrompts(followUpPrompts)
        }
    }

    // Handle form submission
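The API returns `followUpPrompts` as a JSON string, so the handler above parses it before storing it in state. A defensive version of that parse (a sketch, not the exact component code) that tolerates missing or malformed payloads:

```javascript
// Parse a followUpPrompts payload defensively: the server sends a JSON
// string that should decode to an array of prompt strings.
function parseFollowUpPrompts(raw) {
    if (!raw) return []
    try {
        const parsed = JSON.parse(raw)
        // only accept an array; anything else is treated as no prompts
        return Array.isArray(parsed) ? parsed : []
    } catch {
        return []
    }
}
```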
@@ -954,6 +971,7 @@ export const ChatMessage = ({ open, chatflowid, isAgentCanvas, isDialog, previews }) => {
                }
            })
        }
        if (message.followUpPrompts) obj.followUpPrompts = JSON.parse(message.followUpPrompts)
        return obj
    })
    setMessages((prevMessages) => [...prevMessages, ...loadedMessages])
@@ -1013,6 +1031,10 @@ export const ChatMessage = ({ open, chatflowid, isAgentCanvas, isDialog, previews }) => {
                })
            }
        }

        if (config.followUpPrompts) {
            setFollowUpPromptsStatus(config.followUpPrompts.status)
        }
    }
}
// eslint-disable-next-line react-hooks/exhaustive-deps
@@ -1078,6 +1100,17 @@ export const ChatMessage = ({ open, chatflowid, isAgentCanvas, isDialog, previews }) => {
        // eslint-disable-next-line
    }, [previews])

    useEffect(() => {
        if (followUpPromptsStatus && messages.length > 0) {
            const lastMessage = messages[messages.length - 1]
            if (lastMessage.type === 'apiMessage' && lastMessage.followUpPrompts) {
                setFollowUpPrompts(lastMessage.followUpPrompts)
            } else if (lastMessage.type === 'userMessage') {
                setFollowUpPrompts([])
            }
        }
    }, [followUpPromptsStatus, messages])

    const copyMessageToClipboard = async (text) => {
        try {
            await navigator.clipboard.writeText(text || '')
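The effect above derives the visible follow-up prompts from the last message: an API reply carrying prompts shows them, and a fresh user message clears them. That rule can be sketched as a pure function (a simplification of the effect, which also keeps the previous prompts for other message types):

```javascript
// Pure sketch of the effect's rule: prompts come from the last apiMessage,
// and an in-flight user message yields none.
function deriveFollowUpPrompts(messages) {
    if (!messages.length) return []
    const last = messages[messages.length - 1]
    if (last.type === 'apiMessage' && last.followUpPrompts) {
        return last.followUpPrompts
    }
    return []
}
```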
@@ -1970,6 +2003,26 @@ export const ChatMessage = ({ open, chatflowid, isAgentCanvas, isDialog, previews }) => {
                </div>
            )}

            {messages && messages.length > 2 && followUpPromptsStatus && followUpPrompts.length > 0 && (
                <>
                    <Divider sx={{ width: '100%' }} />
                    <Box sx={{ display: 'flex', flexDirection: 'column', position: 'relative', pt: 1.5 }}>
                        <Stack sx={{ flexDirection: 'row', alignItems: 'center', px: 1.5, gap: 0.5 }}>
                            <IconSparkles size={12} />
                            <Typography sx={{ fontSize: '0.75rem' }} variant='body2'>
                                Try these prompts
                            </Typography>
                        </Stack>
                        <FollowUpPromptsCard
                            sx={{ bottom: previews && previews.length > 0 ? 70 : 0 }}
                            followUpPrompts={followUpPrompts || []}
                            onPromptClick={handleFollowUpPromptClick}
                            isGrid={isDialog}
                        />
                    </Box>
                </>
            )}

            <Divider sx={{ width: '100%' }} />

            <div className='center'>
pnpm-lock.yaml: file diff suppressed because it is too large