* feat: Enhance ConditionAgent with conversation history selection options
- Added a new parameter `conversationHistorySelection` to allow users to choose which messages from the conversation history to include in prompts.
- Options include: User Question, Last Conversation Message, All Conversation Messages, and Empty.
- Default selection is set to 'All Conversation Messages' for improved context management in sequential LLM and Agent nodes.
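  A rough sketch of how such an option could be declared on the node (the labels come from this changelog; the internal values echo the case names used later in this log, and the parameter type below is a minimal stand-in rather than Flowise's actual interface):

  ```ts
  // Minimal stand-in for the node parameter type used by the components package
  interface NodeParam {
      label: string
      name: string
      type: string
      options?: { label: string; name: string }[]
      default?: string
      optional?: boolean
  }

  const conversationHistorySelection: NodeParam = {
      label: 'Conversation History',
      name: 'conversationHistorySelection',
      type: 'options',
      options: [
          { label: 'User Question', name: 'user_question' },
          { label: 'Last Conversation Message', name: 'last_message' },
          { label: 'All Conversation Messages', name: 'all_messages' },
          { label: 'Empty', name: 'empty' }
      ],
      default: 'all_messages',
      optional: true
  }
  ```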
* Bump version from 2.0 to 3.0
* Update 'Require Approval' button description to clarify dependency on MemoryAgent
* Refine RequiredApproval message description
* Fix message description by removing unescaped character ('\')
* Break line
* docs: clarify and enhance the "Require Approval" description for Agent node
* chore: slight update of the description
* feat: Add option to disable conversation history
- Add new `disableConversationHistory` boolean parameter in LLMNodes.ts and Agent.ts to optionally skip including conversation history in prompts
- Fix potential error in Agent.ts when messages array is empty by adding null safety checks
- Improve memory efficiency by allowing stateless interactions when history isn't needed
* feat: add conversation history filtering options
Replace the disable conversation history feature with a more flexible filtering system that allows selecting:
- User question only
- Last message only
- All messages (default)
- No messages
This provides more granular control over conversation context management.
* chore: break lines
* chore: removed trailing semicolons
* chore: fix eslint errors
* fix(sequentialagents): improve conversation history filtering logic
- Remove unnecessary state.messages check for user_question case
- Add proper null handling for last_message and all_messages cases
- Remove @ts-ignore comments with proper typing
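  A minimal sketch of the selection logic after this fix, assuming a LangChain `BaseMessage` history; the helper name and exact shape are illustrative, not the actual node code:

  ```ts
  import { BaseMessage, HumanMessage } from '@langchain/core/messages'

  type HistorySelection = 'user_question' | 'last_message' | 'all_messages' | 'empty'

  const selectConversationHistory = (
      selection: HistorySelection,
      userQuestion: string,
      messages?: BaseMessage[]
  ): BaseMessage[] => {
      switch (selection) {
          case 'user_question':
              // No dependency on state.messages: only the incoming question is used
              return [new HumanMessage(userQuestion)]
          case 'last_message': {
              // Guard against an empty or missing history instead of indexing blindly
              const last = messages?.[messages.length - 1]
              return last ? [last] : []
          }
          case 'empty':
              return []
          case 'all_messages':
          default:
              return messages ?? []
      }
  }
  ```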
* Update LLMNode.ts
* Update Agent.ts
---------
Co-authored-by: Henry Heng <henryheng@flowiseai.com>
* Added reopening of the Redis connection if it is closed
* Removed unnecessary modification
* Added check connection in all methods
* Renamed method
* added await on method call
* Refactor Redis connection handling: remove singleton pattern, ensure connections are opened and closed per operation.
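  A sketch of the per-operation pattern described here, assuming ioredis; the helper name and usage are illustrative:

  ```ts
  import Redis from 'ioredis'

  // Illustrative only: one connection per operation instead of a shared singleton
  const withRedis = async <T>(url: string, fn: (client: Redis) => Promise<T>): Promise<T> => {
      const client = new Redis(url) // open a fresh connection for this operation
      try {
          return await fn(client)
      } finally {
          await client.quit() // always close it again, even when the operation throws
      }
  }

  // e.g. await withRedis(process.env.REDIS_URL ?? 'redis://localhost:6379', (c) => c.get('someKey'))
  ```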
---------
Co-authored-by: Maicon Matsubara <maicon@fullwise.com.br>
* Added support for state-based metadata filter to Retriever Tool
* Update RetrieverTool.ts
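  A purely hypothetical illustration of the state-based filter mentioned above, substituting agent-state values into a metadata filter; the placeholder syntax and helper are assumptions, not the actual RetrieverTool implementation:

  ```ts
  // Hypothetical helper: resolve "$flow.state.<key>" placeholders from the agent state
  const resolveFilterFromState = (
      filter: Record<string, string>,
      state: Record<string, unknown>
  ): Record<string, unknown> => {
      const resolved: Record<string, unknown> = {}
      for (const [field, value] of Object.entries(filter)) {
          const match = /^\$flow\.state\.(\w+)$/.exec(value)
          resolved[field] = match ? state[match[1]] : value
      }
      return resolved
  }

  // e.g. resolveFilterFromState({ userId: '$flow.state.userId' }, { userId: 'abc-123' })
  ```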
---------
Co-authored-by: Henry Heng <henryheng@flowiseai.com>
* adding support for Prometheus and Grafana
* OpenTelemetry
* lint fixes
* missing counter and telemetry standardization
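  A minimal sketch of the kind of counter such an integration exposes, using prom-client; the metric and label names below are assumptions, not Flowise's actual metric names:

  ```ts
  import { Counter, Registry } from 'prom-client'

  const register = new Registry()

  const predictionsTotal = new Counter({
      name: 'predictions_total',
      help: 'Total number of prediction requests served',
      labelNames: ['chatflowId'],
      registers: [register]
  })

  // Incremented wherever a prediction is handled; Prometheus scrapes the registry
  // and Grafana visualises the resulting series
  predictionsTotal.inc({ chatflowId: 'example-flow' })
  ```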
---------
Co-authored-by: Henry <hzj94@hotmail.com>
* Set azureOpenAIBasePath to undefined if empty to enforce usage of env variable AZURE_OPENAI_BASE_PATH in @langchain+openai@0.0.30_encoding@0.1.13_langchain@0.2.11/node_modules/@langchain/openai/dist/embeddings.cjs
Fixed a bug in `restructureMessages` that caused the message content to blow up with escape characters and eventually crash the flow with a "repetitive patterns" error
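  A small sketch of the azureOpenAIBasePath change above; the helper name is illustrative:

  ```ts
  // An empty base path becomes undefined so that @langchain/openai falls back
  // to the AZURE_OPENAI_BASE_PATH environment variable
  const normalizeAzureBasePath = (basePath?: string): string | undefined =>
      basePath && basePath.trim() !== '' ? basePath : undefined
  ```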
* Make Unstructured API URL optional when environment variable is present
* Fix empty apiUrl option in Unstructured Flowise loader
* Add documentation for env vars
* add functionality for full file uploads, add remove messages from view dialog and API
* add attachments swagger
* update question to include uploadedFilesContent
* make config dialog modal lg size
* Refactor ChatOpenAI_ChatModels to include stopSequence parameter
* lint fix
* Stop Sequence String will now be split by comma
* Update ChatOpenAI.ts
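  A minimal sketch of the comma splitting described above; trimming whitespace and dropping empty entries are assumptions about the exact behaviour:

  ```ts
  const parseStopSequence = (stopSequence?: string): string[] | undefined =>
      stopSequence
          ? stopSequence.split(',').map((s) => s.trim()).filter((s) => s.length > 0)
          : undefined

  // e.g. parseStopSequence('Human:, Observation:') -> ['Human:', 'Observation:']
  // which would be passed as the model's `stop` option
  ```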
---------
Co-authored-by: Henry Heng <henryheng@flowiseai.com>
* feat: Add Alibaba API credential and ChatAlibabaTongyi node
* lint fix
* Add chatAlibabaTongyi model to models.json and chat models
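  Roughly what the new node delegates to, assuming the @langchain/community ChatAlibabaTongyi class; the model choice and credential wiring below are assumptions, not the node's exact defaults:

  ```ts
  import { ChatAlibabaTongyi } from '@langchain/community/chat_models/alibaba_tongyi'

  const chatModel = new ChatAlibabaTongyi({
      model: 'qwen-turbo',
      temperature: 0.7,
      alibabaApiKey: process.env.ALIBABA_API_KEY
  })

  // const response = await chatModel.invoke('Hello')
  ```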
---------
Co-authored-by: Henry Heng <henryheng@flowiseai.com>
* added Jina AI Embedding support
* Update JinaAIEmbedding.ts
Change model name to string type
* removed jina embeddings
* lint fix
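  Illustrative of what the embedding node wraps, assuming @langchain/community's JinaEmbeddings class; the model name and credential handling are assumptions:

  ```ts
  import { JinaEmbeddings } from '@langchain/community/embeddings/jina'

  const embeddings = new JinaEmbeddings({
      apiKey: process.env.JINAAI_API_KEY,
      model: 'jina-embeddings-v2-base-en'
  })

  // const vectors = await embeddings.embedDocuments(['Flowise embeds this text'])
  ```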
---------
Co-authored-by: Henry Heng <henryheng@flowiseai.com>