
fix: re-added message store component output dropdown #8399


Open · wants to merge 8 commits into base: main

Conversation

@lucaseduoli lucaseduoli (Collaborator) commented Jun 6, 2025

This pull request introduces enhancements to the MemoryComponent class and updates the starter project configuration for custom components. Key changes include adding new output methods for retrieving messages in different formats, modifying the structure of outputs in the MemoryComponent, and updating node and edge configurations in the starter project JSON.

Enhancements to MemoryComponent:

  • New output method: Added retrieve_messages_as_text to convert stored messages into text format using the data_to_text helper.
  • Expanded outputs: Updated the outputs list to include Data, Message, and DataFrame formats, allowing more flexibility in retrieving and processing stored messages. [1] [2]
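
For orientation, the sketch below pieces together the updated outputs list and the new text method from the review excerpts later on this page; the import locations, the Component base class, and the self.template field are assumptions drawn from those excerpts rather than a verbatim copy of memory.py.

# Hedged sketch of the relevant parts of MemoryComponent; names and paths follow
# the review comments below and may differ slightly from the actual file.
from langflow.custom import Component            # base class location assumed
from langflow.helpers.data import data_to_text   # helper path cited in the review
from langflow.schema.message import Message      # Message path cited in the review
from langflow.template.field.base import Output  # Output module cited in the review

class MemoryComponent(Component):
    # Three selectable outputs in "Retrieve" mode, each flagged dynamic=True.
    outputs = [
        Output(display_name="Data", name="messages", method="retrieve_messages", dynamic=True),
        Output(display_name="Message", name="messages_text", method="retrieve_messages_as_text", dynamic=True),
        Output(display_name="DataFrame", name="dataframe", method="retrieve_messages_dataframe", dynamic=True),
    ]

    async def retrieve_messages_as_text(self) -> Message:
        # Reuse the existing Data retrieval, then render it with the component's
        # template via data_to_text, as described in the review notes.
        stored_text = data_to_text(self.template, await self.retrieve_messages())
        self.status = stored_text
        return Message(text=stored_text)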

Updates to starter project configuration:

  • Node and edge IDs: Replaced old IDs with new unique identifiers for nodes and edges in Custom Component Maker.json to ensure proper linkage and compatibility. [1] [2]
  • Output definitions: Added new output configurations for Data and Message types, aligning with the updated MemoryComponent outputs.

Summary by CodeRabbit

  • New Features

    • Enhanced the Memory component to support multiple output formats when retrieving messages, including raw data, formatted text, and DataFrame.
    • Added a new method for retrieving messages as formatted text.
  • Bug Fixes

    • Improved dropdown selection in frontend regression tests to ensure correct element interaction.
  • Chores

    • Updated starter project templates to include enhanced Memory component functionality and set default API key values for relevant nodes.
  • Refactor

    • Simplified output selection logic and removed obsolete output selection methods across backend and utility code.
    • Removed the TypeConverter component and related references to streamline workflows.
  • Tests

    • Removed outdated unit tests related to output selection.
    • Updated tests to reflect removal of the TypeConverter component and adjusted graph structures accordingly.

@lucaseduoli lucaseduoli self-assigned this Jun 6, 2025
@dosubot dosubot bot added the size:XXL This PR changes 1000+ lines, ignoring generated files. label Jun 6, 2025
coderabbitai bot (Contributor) commented Jun 6, 2025

Walkthrough

The changes introduce enhanced output handling for the Memory component, supporting multiple output formats (raw data, formatted text, DataFrame) and a new method for retrieving messages as formatted text. Output selection logic is refactored, removing the set_selected method and updating related backend and frontend code, configuration files, and tests accordingly.

Changes

File(s) Change Summary
src/backend/base/langflow/components/helpers/memory.py MemoryComponent updated to support three outputs (raw, text, DataFrame); added retrieve_messages_as_text method; dynamic outputs in "Retrieve" mode; imports updated.
src/backend/base/langflow/custom/custom_component/component.py Removed calls to output.set_selected() in output type handling logic.
src/backend/base/langflow/custom/utils.py Refactored output selection: removed unconditional set_selected() calls; now sets selected output based on config matching.
src/backend/base/langflow/template/field/base.py Removed set_selected method from the Output class.
src/backend/base/langflow/initial_setup/starter_projects/Custom Component Maker.json Updated node/edge IDs; enhanced Memory component with multiple outputs and new methods; removed TypeConverterComponent; adjusted node positions and API key inputs.
src/backend/base/langflow/initial_setup/starter_projects/Meeting Summary.json Updated node/edge IDs; enhanced Memory component with multiple outputs and methods; API key inputs for several components set to load from DB with default values; updated ChatInput/Output defaults.
src/backend/base/langflow/initial_setup/starter_projects/Memory Chatbot.json Updated node/edge IDs; enhanced Memory component with multiple outputs and methods; OpenAIModel API key default set; minor adjustments to other nodes.
src/backend/tests/unit/test_schema.py Removed test method for Output.set_selected.
src/frontend/tests/core/regression/generalBugs-shard-9.spec.ts Updated test to explicitly select the first matching dropdown elements for output memory to avoid ambiguity.
src/backend/tests/unit/graph/graph/test_graph_state_model.py Removed usage of TypeConverterComponent; updated PromptComponent context to use retrieve_messages_as_text directly; adjusted expected keys accordingly.
src/backend/tests/unit/initial_setup/starter_projects/test_memory_chatbot.py Removed TypeConverterComponent usage; updated graph edges and execution order; replaced type conversion with direct message retrieval as text.
src/backend/tests/unit/graph/test_graph.py Minor formatting and assertion simplifications; no logic changes.

Sequence Diagram(s)

sequenceDiagram
    participant User
    participant MemoryComponent
    participant ExternalMemory

    User->>MemoryComponent: Retrieve messages (mode = "Retrieve")
    alt Output: messages (raw)
        MemoryComponent->>ExternalMemory: aget_messages()
        ExternalMemory-->>MemoryComponent: messages (raw data)
        MemoryComponent-->>User: messages (raw data)
    else Output: messages_text (formatted)
        MemoryComponent->>ExternalMemory: aget_messages()
        ExternalMemory-->>MemoryComponent: messages (raw data)
        MemoryComponent->>MemoryComponent: Format messages as text (data_to_text)
        MemoryComponent-->>User: Message (formatted text)
    else Output: dataframe
        MemoryComponent->>ExternalMemory: aget_messages()
        ExternalMemory-->>MemoryComponent: messages (raw data)
        MemoryComponent->>MemoryComponent: Convert to DataFrame
        MemoryComponent-->>User: DataFrame
    end

Possibly related PRs

  • feat: combine chat history and chat store into a CRUD component #8323: Refactors MemoryComponent to support two modes ("Retrieve" and "Store") with dynamic inputs and outputs and adds message storing functionality; both modify the same class and its outputs and methods for message retrieval and formatting, indicating a direct and strong code-level relationship.

Suggested labels

enhancement, size:L

Suggested reviewers

  • edwinjosechittilappilly
  • mfortman11

📜 Recent review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between ce7a953 and 6c2e46c.

📒 Files selected for processing (1)
  • src/backend/tests/unit/graph/test_graph.py (1 hunks)
✅ Files skipped from review due to trivial changes (1)
  • src/backend/tests/unit/graph/test_graph.py
⏰ Context from checks skipped due to timeout of 90000ms (5)
  • GitHub Check: Filter Paths
  • GitHub Check: Optimize new Python code in this PR
  • GitHub Check: Update Starter Projects
  • GitHub Check: Run Ruff Check and Format
  • GitHub Check: Ruff Style Check (3.13)

@dosubot dosubot bot added the enhancement New feature or request label Jun 6, 2025
@github-actions github-actions bot added bug Something isn't working and removed enhancement New feature or request labels Jun 6, 2025
@Copilot Copilot AI (Contributor) left a comment

Pull Request Overview

This PR reintroduces the message store component’s output dropdown in both code and configuration.

  • Updates frontend test selectors to click the first matching dropdown and item.
  • Enhances MemoryComponent with three distinct output methods (Data, Message, DataFrame) and adds retrieve_messages_as_text.
  • Updates starter-project JSONs (Memory Chatbot.json and Custom Component Maker.json) with new node/edge IDs and aligned output definitions.

Reviewed Changes

Copilot reviewed 5 out of 5 changed files in this pull request and generated 1 comment.

File Description
src/frontend/tests/core/regression/generalBugs-shard-9.spec.ts Adjusted test selectors to .first() on dropdowns
src/backend/base/langflow/initial_setup/starter_projects/Memory Chatbot.json Updated node/edge IDs and added new output configs
src/backend/base/langflow/initial_setup/starter_projects/Custom Component Maker.json Replaced IDs for nodes/edges and removed legacy paths
src/backend/base/langflow/components/helpers/memory.py Imported data_to_text, added outputs for text/Data
Comments suppressed due to low confidence (1)

src/frontend/tests/core/regression/generalBugs-shard-9.spec.ts:92

  • With the addition of 'Data' and 'DataFrame' output options in MemoryComponent, consider adding test steps that open the dropdown and select those items as well to ensure full coverage of the new outputs.
await page.getByTestId("dropdown-output-memory").first().click();

@github-actions github-actions bot added bug Something isn't working and removed bug Something isn't working labels Jun 6, 2025
@coderabbitai coderabbitai bot (Contributor) left a comment

Actionable comments posted: 1

♻️ Duplicate comments (1)
src/backend/base/langflow/components/helpers/memory.py (1)

124-126: Add missing dynamic=True for consistency.

The outputs in the update_outputs method are missing the dynamic=True flag that's present in the class-level outputs definition. This inconsistency could lead to unexpected UI behavior.

Apply this diff to maintain consistency:

-                    Output(display_name="Data", name="messages", method="retrieve_messages"),
-                    Output(display_name="Message", name="messages_text", method="retrieve_messages_as_text"),
-                    Output(display_name="DataFrame", name="dataframe", method="retrieve_messages_dataframe"),
+                    Output(display_name="Data", name="messages", method="retrieve_messages", dynamic=True),
+                    Output(display_name="Message", name="messages_text", method="retrieve_messages_as_text", dynamic=True),
+                    Output(display_name="DataFrame", name="dataframe", method="retrieve_messages_dataframe", dynamic=True),
🧹 Nitpick comments (2)
src/backend/base/langflow/initial_setup/starter_projects/Memory Chatbot.json (1)

1380-1403: MemoryComponent outputs expanded.
The Memory node now exposes three outputs—Data (messages), Message (messages_text), and DataFrame (dataframe). This aligns with the enhanced MemoryComponent API; ensure documentation snippets and example flows are updated to demonstrate selecting each output type.

src/backend/base/langflow/initial_setup/starter_projects/Meeting Summary.json (1)

2640-2646: Implementation of retrieve_messages_as_text
The new asynchronous method correctly uses data_to_text(self.template, await self.retrieve_messages()) and wraps the result in a Message. Consider adding error handling for empty or malformed message lists to prevent runtime failures in downstream components.
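
For illustration only, one way such a guard could look; the empty-history fallback and the exception types caught here are illustrative choices, not something this PR implements.

async def retrieve_messages_as_text(self) -> Message:
    messages = await self.retrieve_messages()
    if not messages:
        # Empty history: return an empty Message rather than passing an
        # empty or None payload to downstream components.
        self.status = ""
        return Message(text="")
    try:
        stored_text = data_to_text(self.template, messages)
    except (KeyError, AttributeError) as exc:
        # Malformed Data entries, e.g. missing fields the template expects.
        msg = f"Could not format stored messages: {exc}"
        raise ValueError(msg) from exc
    self.status = stored_text
    return Message(text=stored_text)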

📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between e23e543 and 9ee3a4c.

⛔ Files ignored due to path filters (1)
  • src/frontend/package-lock.json is excluded by !**/package-lock.json
📒 Files selected for processing (9)
  • src/backend/base/langflow/components/helpers/memory.py (4 hunks)
  • src/backend/base/langflow/custom/custom_component/component.py (0 hunks)
  • src/backend/base/langflow/custom/utils.py (1 hunks)
  • src/backend/base/langflow/initial_setup/starter_projects/Custom Component Maker.json (27 hunks)
  • src/backend/base/langflow/initial_setup/starter_projects/Meeting Summary.json (42 hunks)
  • src/backend/base/langflow/initial_setup/starter_projects/Memory Chatbot.json (18 hunks)
  • src/backend/base/langflow/template/field/base.py (0 hunks)
  • src/backend/tests/unit/test_schema.py (0 hunks)
  • src/frontend/tests/core/regression/generalBugs-shard-9.spec.ts (1 hunks)
💤 Files with no reviewable changes (3)
  • src/backend/tests/unit/test_schema.py
  • src/backend/base/langflow/template/field/base.py
  • src/backend/base/langflow/custom/custom_component/component.py
⏰ Context from checks skipped due to timeout of 90000ms (3)
  • GitHub Check: Optimize new Python code in this PR
  • GitHub Check: Run Ruff Check and Format
  • GitHub Check: Update Starter Projects
🔇 Additional comments (41)
src/frontend/tests/core/regression/generalBugs-shard-9.spec.ts (1)

92-92: Good defensive test programming.

Adding .first() ensures the test selects the intended dropdown element when multiple outputs are available, which aligns with the enhanced MemoryComponent that now exposes multiple output formats.

Also applies to: 94-97

src/backend/base/langflow/custom/utils.py (1)

383-388: Improved output selection handling.

The explicit approach of finding the selected output from configuration and setting the selected attribute directly is cleaner than the previous method-based approach.
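
Pictured roughly, the config-matching approach might look like the sketch below; the function name, the shape of the saved config entries, and the selected attribute handling are all assumptions inferred from this comment, not the actual utils.py code.

def apply_selected_output(outputs, config_outputs):
    """Hedged sketch: carry the previously selected output format from the
    saved flow config onto the rebuilt outputs, replacing the removed
    Output.set_selected() helper with a direct attribute assignment."""
    saved_by_name = {entry.get("name"): entry for entry in config_outputs}
    for output in outputs:
        saved = saved_by_name.get(output.name)
        if saved and saved.get("selected"):
            output.selected = saved["selected"]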

src/backend/base/langflow/components/helpers/memory.py (3)

4-4: Good addition of required import.

The data_to_text import is necessary for the new text formatting functionality.


101-105: LGTM! Multiple output formats enhance component flexibility.

The three output formats (Data, Message, DataFrame) provide good flexibility for different use cases. All outputs correctly include dynamic=True.


172-175: Well-implemented text formatting method.

The new retrieve_messages_as_text method correctly:

  • Uses async/await for the retrieval operation
  • Formats data using the component's template
  • Updates component status
  • Returns properly wrapped result
src/backend/base/langflow/initial_setup/starter_projects/Custom Component Maker.json (20)

10-31: Edge ID and handles updated correctly for URL → Prompt mapping

The new unique edge ID and handle strings follow the established pattern and align the URL-raIKe node’s text output with the Prompt component’s EXAMPLE_COMPONENTS input.


39-61: Edge ID and handles updated correctly for second URL → Prompt mapping

Consistent with the previous edge update, the URL-j5F0d text output now correctly connects to the BASE_COMPONENT_CODE input of the Prompt node.


68-89: Edge ID and handles updated correctly for third URL → Prompt mapping

The URL-JjyEG node’s text output is now properly linked to the CUSTOM_COMPONENT_CODE input of the Prompt node with a unique edge ID.


105-118: Edge updated for Prompt → Anthropic Model mapping

The Prompt component’s prompt output now correctly connects to the AnthropicModel’s input_value field with a new unique edge ID.


142-147: Edge updated for Anthropic Model → Chat Output mapping

The AnthropicModel’s text_output now routes to the ChatOutput component’s input_value, and the edge ID is updated accordingly.


171-176: Edge updated for ChatInput → Prompt mapping

The ChatInput component’s message output is now linked to the Prompt node’s USER_INPUT input with a refreshed edge ID.


244-252: Memory outputs: confirm Data output definition

The messages output is correctly defined as a Data type with caching enabled and proper method linkage.


254-267: Memory outputs: confirm Message output definition

The messages_text output is correctly added, and set as the default selected Message format.


499-506: Memory node repositioned with new ID

The Memory component node has been assigned a new unique ID (Memory-XWOq3) and repositioned. The ID matches the updated edge definitions.


520-520: Prompt node ID updated consistently

The Prompt component node’s ID (Prompt-sBJbI) is refreshed to align with the revised graph references.


767-767: Note node ID changed

Note IDs have been regenerated (note-i6i1L). No functional impact on component logic.


797-797: Note node repositioned and ID updated

The second note node (note-BOaIr) has an updated ID and coordinates. This only affects layout.

Also applies to: 815-816


832-832: URL-j5F0d node ID refreshed

The URL component instance ID has been updated for uniqueness.


1024-1024: URL-raIKe node ID refreshed

The URL component instance ID has been updated for uniqueness.


1222-1222: URL-JjyEG node ID refreshed

The URL component instance ID has been updated for uniqueness.


1501-1501: AnthropicModel API key: enable load_from_db

Setting load_from_db: true is correct for sourcing the API key from the database by default.


1510-1510: AnthropicModel API key default value set

The API key field now defaults to "ANTHROPIC_API_KEY", matching the new secure default.


1752-1752: AnthropicModel node selected by default

Marking the AnthropicModel node as selected: true aligns with the updated tool preference.


1478-1478:

✅ Verification successful

Verify inclusion of options field in outputs

The options: null property has been added to AnthropicModel’s second output. Please ensure consistency with frontend expectations.


🏁 Script executed:

#!/bin/bash
# Verify if `options` is used in other starter project JSONs
rg -n '"options": null' -A2 src/backend/base/langflow/initial_setup/starter_projects

Length of output: 7444


No changes required: options field is consistent across starter projects
A search shows "options": null appears in every starter project JSON under src/backend/base/langflow/initial_setup/starter_projects (e.g. Custom Component Maker at lines 1462/1478, Youtube Analysis at 2361, Travel Planning Agents at 1016/1268/1421, etc.). The new entry in Custom Component Maker follows this pattern. Please ensure the frontend gracefully handles options: null, but no code changes are needed here.


1462-1462:

❓ Verification inconclusive

Verify inclusion of options field in outputs

The options: null property has been added to AnthropicModel’s first output. Confirm this field is supported by the JSON schema and does not break output rendering.


🏁 Script executed:

#!/bin/bash
# Verify if `options` is used in other starter project JSONs
rg -n '"options": null' -A2 src/backend/base/langflow/initial_setup/starter_projects

Length of output: 7444


🏁 Script executed:

#!/bin/bash
# List JSON files under initial_setup directory to locate schema definitions
find src/backend/base/langflow/initial_setup -maxdepth 1 -type f -name '*.json'

Length of output: 81


Consistent options field across starter projects
The "options": null property appears in 15 JSON files under src/backend/base/langflow/initial_setup/starter_projects, so its inclusion here matches the existing pattern.

• No immediate fix required in this file.
• Please confirm that your JSON‐loading logic or schema validation includes (or tolerates) an options field on outputs to avoid runtime errors.
• If there is a centralized JSON schema for these starter definitions, ensure it allows an "options": null entry (or update it accordingly).

src/backend/base/langflow/initial_setup/starter_projects/Memory Chatbot.json (13)

10-10: Check edge mapping for ChatInput to OpenAIModel.
The updated edge IDs and handle payloads for ChatInput-AUcQG → OpenAIModel-XoX0N correctly reflect the regenerated node identifiers. Confirm that the serialized sourceHandle and targetHandle strings align with the runtime ReactFlow schema to avoid mismatches at graph render time.

Also applies to: 25-30


38-38: Validate Prompt→OpenAIModel edge updates.
The edge connecting Prompt-obNrd.prompt to OpenAIModel-XoX0N.system_message has been re-generated. Input types and fieldName remain correct, but please verify that downstream code consuming the system_message input still functions as expected with the new edge ID.

Also applies to: 46-46, 53-53, 55-58


67-67: Review OpenAIModel→ChatOutput edge changes.
This edge now exposes all three input types (Data, DataFrame, Message) on ChatOutput-ui0vL.input_value, matching the component’s expanded multi-type support. Ensure the frontend dropdown and handler logic correctly present and handle these options.

Also applies to: 75-75, 83-83, 85-89


94-97: Inspect Memory→Prompt edge for new text output.
The Memory-eFLbY.messages_text output is now wired into Prompt-obNrd.memory with inputTypes [“Message”, “Text”]. Verify that “Text” is recognized by the Prompt component’s input schema (it previously accepted only Message) and that conversion logic produces the intended plain-text context.

Also applies to: 103-116


122-123: ChatInput node ID updated.
Node ChatInput-AUcQG ID changed—ensure any saved templates or automated tests referencing the old ID are updated accordingly.


427-428: ChatOutput node ID updated.
Node ChatOutput-ui0vL ID has been regenerated. Confirm no broken links in starter projects or regression tests rely on the previous identifier.


729-730: Note node note-vPnjW ID regeneration.
Auto-generated note node ID change; no schema impact.


764-765: Note node note-TCLyL ID regeneration.
Another auto-generated note node—no action required.


796-797: Prompt node ID updated.
Prompt-obNrd ID has changed; update any scripts or fixtures referring to this node.


952-953: OpenAIModel node ID updated.
Identifier OpenAIModel-XoX0N was regenerated. Verify that project references and config integrations use this new ID.


1437-1440: Confirm code template for MemoryComponent.
The code template value now imports data_to_text and defines retrieve_messages_as_text. Please verify that the import path (langflow.helpers.data) and helper signature match the current helper implementation to avoid runtime import errors.


1651-1656: Skip auto-generated measured and position changes.
Updates to measured, position, and viewport reflect UI layout adjustments and require no manual review.


1670-1671: Project ID updated.
Top-level JSON id changed—if any CI or sharing URLs reference the old ID, they should be updated.

src/backend/base/langflow/initial_setup/starter_projects/Meeting Summary.json (3)

263-292: Re-added messages_text edge for Memory → Prompt
This edge restores the connection from the Memory component’s new messages_text output to the Prompt’s history input. The linkage appears correct; ensure the UI dropdown for Memory outputs now includes “Message” and that the Prompt component can handle the incoming formatted text.


2489-2529: Enhanced Memory component outputs
The Memory node now exposes three outputs—Data (messages), Message (messages_text), and DataFrame (dataframe)—allowing users to select the desired format. This aligns with the updated outputs list in MemoryComponent. Confirm that the builder UI reflects these options in the outputs dropdown and that backward compatibility is maintained for flows expecting only the original Data output.


2532-2552: Import of data_to_text helper added
The template code now imports data_to_text from langflow.helpers.data. Verify that this helper is available in the starter-project context and that its API matches the usage in retrieve_messages_as_text.

@coderabbitai coderabbitai bot (Contributor) left a comment

Actionable comments posted: 1

📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between e23e543 and 9ee3a4c.

⛔ Files ignored due to path filters (1)
  • src/frontend/package-lock.json is excluded by !**/package-lock.json
📒 Files selected for processing (9)
  • src/backend/base/langflow/components/helpers/memory.py (4 hunks)
  • src/backend/base/langflow/custom/custom_component/component.py (0 hunks)
  • src/backend/base/langflow/custom/utils.py (1 hunks)
  • src/backend/base/langflow/initial_setup/starter_projects/Custom Component Maker.json (27 hunks)
  • src/backend/base/langflow/initial_setup/starter_projects/Meeting Summary.json (42 hunks)
  • src/backend/base/langflow/initial_setup/starter_projects/Memory Chatbot.json (18 hunks)
  • src/backend/base/langflow/template/field/base.py (0 hunks)
  • src/backend/tests/unit/test_schema.py (0 hunks)
  • src/frontend/tests/core/regression/generalBugs-shard-9.spec.ts (1 hunks)
💤 Files with no reviewable changes (3)
  • src/backend/tests/unit/test_schema.py
  • src/backend/base/langflow/template/field/base.py
  • src/backend/base/langflow/custom/custom_component/component.py
🧰 Additional context used
🧬 Code Graph Analysis (1)
src/backend/base/langflow/components/helpers/memory.py (2)
src/backend/base/langflow/helpers/data.py (1)
  • data_to_text (107-120)
src/backend/base/langflow/schema/message.py (1)
  • Message (38-288)
⏰ Context from checks skipped due to timeout of 90000ms (3)
  • GitHub Check: Optimize new Python code in this PR
  • GitHub Check: Run Ruff Check and Format
  • GitHub Check: Update Starter Projects
🔇 Additional comments (48)
src/frontend/tests/core/regression/generalBugs-shard-9.spec.ts (1)

92-97: LGTM! Improved test reliability.

The addition of .first() to element selection prevents potential test flakiness when multiple elements share the same test ID. This change aligns well with the enhanced MemoryComponent that now supports multiple output formats.

src/backend/base/langflow/components/helpers/memory.py (4)

4-4: LGTM! Good addition for text formatting functionality.

The import of data_to_text enables the new text formatting capability in the component.


101-105: Excellent! Multiple output formats now supported.

The component now provides three output formats (Data, Message, DataFrame) with proper dynamic=True flags. This addresses the previous feedback about missing dynamic flags and significantly enhances the component's flexibility.


124-126: LGTM! Consistent output handling in update_outputs.

The dynamic output display logic correctly shows all three output formats when in "Retrieve" mode.


172-175: Well-implemented text formatting method.

The new retrieve_messages_as_text method efficiently reuses the existing retrieve_messages functionality and properly formats the output using the template. The status update and Message object creation are handled correctly.

src/backend/base/langflow/initial_setup/starter_projects/Custom Component Maker.json (9)

39-61: Verify edge mappings for URL-j5F0d → Prompt
Similar to the first URL edge, ensure that every instance of URL-j5F0d in source, sourceHandle, and the edge ID references an existing node. This consistency is crucial for the flow to render correctly.

You can reuse the above script replacing URL-raIKe with URL-j5F0d.


1499-1511: Load Anthropic API key from database by default
The api_key field in the AnthropicModel-BC7GB node template has "load_from_db": true and default value "ANTHROPIC_API_KEY". This correctly reflects the PR’s goal of using DB-stored credentials.


212-1498: Skip: Node ID and position updates
The remainder of changes are unique identifier updates, node positions, viewport settings, and other non-functional metadata adjustments. No further review is needed for these segments.

Also applies to: 1512-2386


69-90:

✅ Verification successful

Verify edge mappings for URL-JjyEG → Prompt
The third URL edge uses URL-JjyEG. Confirm that the new ID is correctly registered in the nodes section and that the frontend will be able to resolve the serialized handle object.


🏁 Script executed:

rg -n '"id": "URL-JjyEG"' -n src/backend/base/langflow/initial_setup/starter_projects/Custom\ Component\ Maker.json
rg -n '"source": "URL-JjyEG"' -n src/backend/base/langflow/initial_setup/starter_projects/Custom\ Component\ Maker.json

Length of output: 373


Edge mapping for URL-JjyEG → Prompt verified

  • Ripgrep results show "id": "URL-JjyEG" in the nodes section at lines 68, 1222, and 1395.
  • The edge source "source": "URL-JjyEG" appears at line 86.
    The node ID is correctly registered and the serialized handle object will resolve in the frontend as expected.

244-281:

✅ Verification successful

Memory node outputs updated to three formats
The Memory-XWOq3 node now declares three outputs—Data, Message, and DataFrame—matching the updated MemoryComponent API. The default selected output is Message. This aligns with the component class enhancements (new retrieve_messages_as_text and DataFrame support).

Please verify that these JSON definitions exactly mirror the outputs list in src/backend/base/langflow/components/helpers/memory.py:


🏁 Script executed:

ast-grep --pattern $'class MemoryComponent { $$$ outputs = [Output(display_name="Data", ...), Output(display_name="Message", ...), Output(display_name="DataFrame", ...)] $$$ }' src/backend/base/langflow/components/helpers/memory.py

Length of output: 232


Let’s locate and inspect the MemoryComponent definition and its outputs in the source to confirm alignment:


🏁 Script executed:

#!/bin/bash
# 1. Find the MemoryComponent class
rg "class MemoryComponent" -n src/backend/base/langflow/components/helpers/memory.py

# 2. Locate where `outputs` is assigned or defined
rg "outputs\s*=" -n src/backend/base/langflow/components/helpers/memory.py

# 3. List all Output instantiations to see their parameters
rg "Output\(" -n src/backend/base/langflow/components/helpers/memory.py

Length of output: 984


Outputs alignment verified for MemoryComponent
The three outputs (Data, Message, DataFrame) in Custom Component Maker.json exactly match the MemoryComponent definition in src/backend/base/langflow/components/helpers/memory.py (lines 102–104 & 124–126). No further changes are required.


179-205:

✅ Verification successful

Verify edge mappings for Memory → Prompt
A new edge xy-edge__Memory-XWOq3…-Prompt-sBJbI maps the messages_text output to CHAT_HISTORY. Ensure Memory-XWOq3 is present as a node ID, and the Prompt node’s CHAT_HISTORY placeholder is correctly configured to accept "Message" input.


🏁 Script executed:

rg -n '"id": "Memory-XWOq3"' -n src/backend/base/langflow/initial_setup/starter_projects/Custom\ Component\ Maker.json
rg -n '"fieldName": "CHAT_HISTORY"' -n src/backend/base/langflow/initial_setup/starter_projects/Custom\ Component\ Maker.json

Length of output: 401


🏁 Script executed:

#!/bin/bash
rg -A5 -B5 '"id": "Prompt-sBJbI"' -n src/backend/base/langflow/initial_setup/starter_projects/Custom\ Component\ Maker.json

Length of output: 2629


Edge mapping Memory → Prompt verified

  • Node Memory-XWOq3 is present (IDs at lines 183, 212, 499).
  • Prompt-sBJbI’s CHAT_HISTORY targetHandle accepts "Message" (inputTypes include "Message", lines 189–196).

No further action required.


10-31:

✅ Verification successful

Verify edge mappings for URL → Prompt
The edges connecting URL-raIKe to Prompt-sBJbI have been updated with new unique IDs and serialized handle objects. Confirm that these new IDs align with the actual node IDs and that the serialized sourceHandle/targetHandle strings are correctly parsed by the frontend.

Run:


🏁 Script executed:

#!/bin/bash
# Verify that the edge source/target IDs match existing nodes.
rg -n '"source": "URL-raIKe"' -n src/backend/base/langflow/initial_setup/starter_projects/Custom\ Component\ Maker.json
rg -n '"id": "URL-raIKe"' -n src/backend/base/langflow/initial_setup/starter_projects/Custom\ Component\ Maker.json

Length of output: 373


🏁 Script executed:

#!/bin/bash
file="src/backend/base/langflow/initial_setup/starter_projects/Custom Component Maker.json"

echo "Checking Prompt node ID:"
rg -n '"id": "Prompt-sBJbI"' -n "$file"

echo -e "\nChecking edge target:"
rg -n '"target": "Prompt-sBJbI"' -n "$file"

echo -e "\nInspecting sourceHandle snippet:"
rg -n '"sourceHandle":' -n "$file" -A2

echo -e "\nInspecting targetHandle snippet:"
rg -n '"targetHandle":' -n "$file" -A2

Length of output: 6217


Edge mappings for URL→Prompt verified and approved
All occurrences of URL-raIKe and Prompt-sBJbI were found as node IDs, and both the object-style and serialized sourceHandle/targetHandle entries correctly reference those IDs. No discrepancies detected. LGTM.


156-176:

✅ Verification successful

Verify edge mappings for ChatInput → Prompt
The edge connecting ChatInput-dl2NG back to the prompt node has been updated. Confirm that ChatInput-dl2NG is defined in nodes and that Prompt-sBJbI accepts "Message" and "Text" as inputs.


🏁 Script executed:

rg -n '"id": "ChatInput-dl2NG"' -n src/backend/base/langflow/initial_setup/starter_projects/Custom\ Component\ Maker.json

Length of output: 242


I’ll pull the surrounding definitions for both nodes to confirm their mappings.


🏁 Script executed:

#!/bin/bash
FILE="src/backend/base/langflow/initial_setup/starter_projects/Custom Component Maker.json"

echo "— ChatInput-dl2NG node definition —"
rg -n -B5 -A5 '"id": "ChatInput-dl2NG"' "$FILE"

echo
echo "— Prompt-sBJbI node definition —"
rg -n -B5 -A5 '"id": "Prompt-sBJbI"' "$FILE"

Length of output: 3947


Edge mapping ChatInput → Prompt confirmed
ChatInput-dl2NG is defined as a ChatInput node emitting "Message", and Prompt-sBJbI’s target handles (e.g. USER_INPUT, CHAT_HISTORY) accept both "Message" and "Text". No changes required.


97-118:

✅ Verification successful

Verify edge mappings for Prompt → AnthropicModel
The edge from Prompt-sBJbI to AnthropicModel-BC7GB has been remapped. Ensure that AnthropicModel-BC7GB exists in the nodes list and that required inputType "Message" is valid for its input_value field.


🏁 Script executed:

rg -n '"id": "AnthropicModel-BC7GB"' -n src/backend/base/langflow/initial_setup/starter_projects/Custom\ Component\ Maker.json

Length of output: 308


🏁 Script executed:

#!/bin/bash
# Show context around the AnthropicModel node definition to verify its fields and inputTypes
rg -A8 -B8 '"id": "AnthropicModel-BC7GB"' -n 'src/backend/base/langflow/initial_setup/starter_projects/Custom Component Maker.json'

Length of output: 2448


🏁 Script executed:

#!/bin/bash
# Extract the AnthropicModel-BC7GB node definition to inspect its inputs
sed -n '1410,1550p' 'src/backend/base/langflow/initial_setup/starter_projects/Custom Component Maker.json'

Length of output: 4786


🏁 Script executed:

#!/bin/bash
# Verify the definition of the `input_value` field in the template
rg -A5 -B5 '"input_value": {' -n 'src/backend/base/langflow/initial_setup/starter_projects/Custom Component Maker.json'

Length of output: 217


🏁 Script executed:

#!/bin/bash
# Search for the definition of the `input_value` template field using fixed string matching
rg -F -n '"input_value": {' 'src/backend/base/langflow/initial_setup/starter_projects/Custom Component Maker.json'

Length of output: 224


🏁 Script executed:

#!/bin/bash
# Display the template definition for `input_value` to verify supported input_types
sed -n '1545,1585p' 'src/backend/base/langflow/initial_setup/starter_projects/Custom Component Maker.json'

Length of output: 9270


Edge Mapping Confirmed for Prompt → AnthropicModel

  • Node AnthropicModel-BC7GB is defined in the JSON.
  • Its input_value field declares input_types: ["Message"], matching the edge’s "Message" type.

No further action required.

src/backend/base/langflow/initial_setup/starter_projects/Memory Chatbot.json (18)

10-10: Consistent ID update for ChatInput→OpenAIModel edge
The node and handle IDs for the ChatInput-to-OpenAIModel connection have been regenerated correctly and the edge reference is consistent. Verify there are no duplicate edge IDs across the graph.

Also applies to: 18-18, 25-25, 27-30


38-38: Updated Prompt→OpenAIModel edge identifiers
The unique IDs and handle descriptors for the Prompt-to-OpenAIModel connection have been refreshed and align with the new node IDs. Ensure the changes propagate correctly in frontend rendering.

Also applies to: 46-46, 53-53, 55-59


67-67: Refreshed OpenAIModel→ChatOutput edge IDs
Edge source and target IDs reflect the regenerated values and the handle encoding matches the updated types. Confirm no broken references in the runtime.

Also applies to: 74-74, 83-83, 85-88


94-96: Regenerated Memory→Prompt edge mapping
The Memory component output (messages_text) now correctly connects to the Prompt input (memory) with new unique node and edge IDs. Validate that inputTypes include the correct types (Message and Text).

Also applies to: 103-116


122-122: ChatInput node data.id updated
Node ChatInput has a new data.id value; ensure it matches the generated ID used in all related edges.


406-406: ChatInput node instance id refreshed
Node instance id is consistent with its data.id. Confirm no mismatches exist.


427-427: ChatOutput node data.id updated
The ChatOutput node's data.id has been regenerated; verify consistency with all incoming edges.


710-710: ChatOutput node instance id aligned
Instance id matches the data.id value. Ensure it aligns across the graph.


729-729: Note node note-vPnjW ID updated
The standalone note node’s ID has been regenerated. There are no dependencies, so no further action is needed.


740-740: Note node instance id updated
Instance id aligns with the data declaration.


764-764: Note-TCLyL node data.id regenerated
ID update only affects display and has no external impact.


777-777: Note-TCLyL instance id consistency
Instance id correctly reflects the data.id.


796-796: Prompt node data.id updated
Ensure the new data.id for the Prompt node matches all edge references.


933-933: Prompt node instance id refresh
Instance id is consistent with data.id; verify against connected edges.


952-952: OpenAIModel node ID regeneration
Both metadata and instance id values have been updated; confirm all edges reference the new identifiers.

Also applies to: 1331-1331


1046-1046: Default API key placeholder
The api_key field default has been set to "OPENAI_API_KEY". Confirm this environment variable is documented for users and included in setup instructions.


1380-1388: New Data output for MemoryComponent
The messages output now returns a Data type via retrieve_messages. Ensure downstream components correctly consume this type.


1393-1402: Added Message output messages_text
The messages_text output invokes retrieve_messages_as_text to return a formatted Message. Validate that the data_to_text template produces the expected formatting.

src/backend/base/langflow/initial_setup/starter_projects/Meeting Summary.json (16)

10-30: Auto-generated flow edge IDs updated correctly
The edge definitions have new unique IDs and handle paths reflecting the regenerated identifiers for Prompt → OpenAIModel. This aligns with the project’s node/edge re-generation.


38-61: Consistent edge updates for OpenAIModel → ChatOutput
All edges from both OpenAIModel instances to their ChatOutput nodes have refreshed IDs and handle metadata. These updates are expected when regenerating starter flow configurations.


68-90: Edge mappings for second OpenAIModel → ChatOutput updated
The second OpenAIModel-nm6TC now connects to ChatOutput-AuLqC with new edge identifiers. This maintains consistency across duplicated model nodes.


98-119: Edge for ChatInput → Prompt refreshed
The ChatInput-50eRP to Prompt-o9CSB edge has refreshed IDs and handles, as expected after flow re-generation.


127-147: Edge for Prompt → OpenAIModel-nm6TC updated
Prompt-o9CSB now links to OpenAIModel-nm6TC with new edge metadata. Regeneration is correct.


149-175: AssemblyAI Job Creator → Poller edge refreshed
The flow edge between AssemblyAITranscriptionJobCreator-7GviL and AssemblyAITranscriptionJobPoller-9pXw0 has updated IDs. This matches the other starter projects.


183-204: Polling results → Parser edge regenerated
The edge from the poller to the parser node now has new unique identifiers. This is expected.


211-233: Parser → Prompt (transcribed text) edge refreshed
The mapping from parser-Qeh0K to Prompt-pYU9n is regenerated with new IDs, preserving the intended connection.


241-263: Parser → ChatOutput-BBZ6j edge updated
Parser output now links to ChatOutput-BBZ6j with new edge metadata. Consistent with overall ID refresh.


265-292: New Memory → Prompt edge for messages_text
You’ve added the messages_text edge from Memory-qCviX to Prompt-o9CSB, enabling the new text‐formatted history output. This aligns with the enhanced MemoryComponent.


2492-2514: Expanded Memory outputs: Data, Message, DataFrame
The Memory node now declares three outputs (messages, messages_text, dataframe) corresponding to raw data, formatted text, and DataFrame. This correctly mirrors the updated MemoryComponent API.


2549-2549: Updated embedded MemoryComponent code
The "value" field now contains the enhanced MemoryComponent class, including retrieve_messages_as_text and DataFrame conversion logic. This matches your backend helper implementation.


2756-2758: Memory node ID and measurements refreshed
Memory-qCviX has a new unique ID and updated measured dimensions, consistent with auto-regeneration of the flow.


346-347: Standardized API key inputs to load from DB
Both the AssemblyAI Poller and Creator templates now set load_from_db: true on their api_key inputs and default to "ASSEMBLY_API_KEY". This standardizes the credential handling across starter projects.

Also applies to: 355-356


535-535: OpenAIModel API key default placeholder
The OpenAIModel templates now load api_key from the DB with the default "OPENAI_API_KEY". This change is consistent across both model instances.


821-821: OpenAIModel-Q72AI measured dimensions updated
OpenAIModel-Q72AI has updated UI measurements after flow regeneration. This has no functional impact.

@langflow-ai langflow-ai deleted a comment from coderabbitai bot Jun 6, 2025
@dosubot dosubot bot added the lgtm This PR has been approved by a maintainer label Jun 6, 2025
@github-actions github-actions bot added bug Something isn't working and removed bug Something isn't working labels Jun 6, 2025
@lucaseduoli lucaseduoli enabled auto-merge June 9, 2025 14:09
@github-actions github-actions bot added bug Something isn't working and removed bug Something isn't working labels Jun 9, 2025
Labels
  • bug (Something isn't working)
  • lgtm (This PR has been approved by a maintainer)
  • size:XXL (This PR changes 1000+ lines, ignoring generated files)