RFC: object_info (v2) #1

Merged: 9 commits into main from rfc_01, Feb 6, 2025

Conversation

@huchenlei (Member) commented Jan 6, 2025

This RFC proposes three key improvements to the ComfyUI API /object_info endpoint:

  1. Lazy loading for COMBO input options to reduce initial payload size
  2. Restructuring node output specifications for better maintainability
  3. Explicit COMBO type definition for clearer client-side handling
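
For illustration only, a minimal sketch (in TypeScript object-literal form; the field names are assumptions based on the discussion below, not the final spec) of how an explicit, lazily-loaded COMBO input could appear to a client:

// Hypothetical shapes, for illustration only.
// Today, a COMBO input is implied by inlining its full option list in /object_info:
const currentSpec = [['model_a.safetensors', 'model_b.safetensors'], { default: 'model_a.safetensors' }]

// Proposed direction: an explicit COMBO type whose options can be fetched lazily,
// keeping the initial /object_info payload small. Field names are assumed.
const proposedSpec = {
  type: 'COMBO',
  lazy: true,                  // client fetches options on first access
  folder_path: 'checkpoints'   // source of the options (assumed field name)
}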

@BennyKok commented Jan 7, 2025

Questions

  1. Will there be a version field for the object info?
  2. Have you thought about including model dependencies at the node level? E.g., if a custom node depends on sam2 from Hugging Face, knowing the source of the models would make it easier to download missing models that the custom node requires.

Context

How we currently work around object_info in ComfyDeploy with existing ComfyUI:

  1. Static Comfy - When a user builds a new machine configuration, we execute ComfyUI and override the object info function so that we can generate object info with placeholders for file paths, because we run on a serverless volume system and the file paths come from a CDN.
  2. Native Mode - Access the full ComfyUI as if it were running locally.

Feedback

By having

"config_name": ("FILE_COMBO", {"folder_path": "configs"}),

it becomes easier for us to substitute where our files are coming from.

Suggestion

  1. It might be good to include the old response format alongside the new one, so we can compare the format changes.

Overall

Overall, I like the direction. Some custom nodes do not yet follow the proper folder_path system; it would be good if this migration pushes custom node devs to upgrade.

@huchenlei (Member, Author):

@BennyKok Thanks for the feedback!

Questions

  1. Will there be a version field for the object info?
  • No. We will be using a new API endpoint while keeping the old /object_info endpoint serving the exact same content as before.
  2. Have you thought about including model dependencies at the node level? E.g., if a custom node depends on sam2 from Hugging Face, knowing the source of the models would make it easier to download missing models that the custom node requires.

"CheckpointLoader": {
"input": {
"required": {
"ckpt_name": [
Copy link
Member Author

@huchenlei huchenlei Jan 7, 2025

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

Here is a decision I am not very sure about: whether we go one step further and flatten the structure here.

From

"combo input": ["COMBO", { options: [1, 2, 3], default: 2}]

to

"combo input":  {"type": "COMBO", "options": [1, 2, 3], "default": 2}

This would further reduce the complexity of parsing the API response, at the cost of an extra named field.

The frontend is already doing this conversion here:
https://github.com/Comfy-Org/ComfyUI_frontend/blob/527561d1488c5ab522b8d9ef14a4b82c97d1f550/src/stores/nodeDefStore.ts#L96-L122
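
For reference, a rough sketch of the kind of normalization the frontend already performs (simplified, with assumed names; see the linked nodeDefStore.ts for the actual logic):

// Simplified sketch of normalizing the tuple-style input spec into a flat object.
// Field names mirror the examples above; this is not the frontend's exact code.
type TupleInputSpec = [string | unknown[], Record<string, unknown>?]

function normalizeInputSpec(spec: TupleInputSpec) {
  const [typeOrOptions, opts = {}] = spec
  if (Array.isArray(typeOrOptions)) {
    // Legacy implicit combo: the first element is the list of options.
    return { type: 'COMBO', options: typeOrOptions, ...opts }
  }
  return { type: typeOrOptions, ...opts }
}

// normalizeInputSpec(['COMBO', { options: [1, 2, 3], default: 2 }])
//   -> { type: 'COMBO', options: [1, 2, 3], default: 2 }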

@ltdrdata (Member) commented Jan 8, 2025

Maybe we can support this feature based on this modification.
comfyanonymous/ComfyUI#6378

@huchenlei changed the title from "object_info_v2" to "rfc-01: object_info_v2" on Jan 8, 2025
@huchenlei changed the title from "rfc-01: object_info_v2" to "rfc-01: object_info (v2)" on Jan 8, 2025
@bezo97 commented Jan 8, 2025

Here's my 2c on this RFC:

  1. Consider renaming the /object_info endpoint to something more descriptive, e.g. /node_definitions.
  2. You could actively tell node developers that action is required, e.g. print a warning when a deprecated endpoint is called.
  3. Migration utility - this feels excessive to me; an easy-to-follow guide should be sufficient. But it's a nice-to-have.

@christian-byrne added the "pending" label (RFC still in comments phase) on Jan 14, 2025
@huchenlei changed the title from "rfc-01: object_info (v2)" to "RFC: object_info (v2)" on Jan 16, 2025
@christian-byrne (Contributor) commented Feb 2, 2025

2.3 Caching Strategy

  • File listings will be cached server-side with a configurable TTL
  • Cache invalidation on file system changes
  • Client-side caching headers for file listings
import time

import folder_paths  # ComfyUI's folder path helpers

CACHE_TTL = 300  # seconds (5 minutes)

class FileListCache:
    def __init__(self):
        self.cache = {}        # folder -> cached file list
        self.last_update = {}  # folder -> timestamp of last refresh

    async def get_files(self, folder, filter_ext=None):
        now = time.time()
        # Serve from cache while the entry is still fresh.
        if folder in self.cache:
            if now - self.last_update[folder] < CACHE_TTL:
                return self.cache[folder]

        # Refresh the listing and update the cache.
        files = folder_paths.get_filename_list(folder)
        self.cache[folder] = files
        self.last_update[folder] = now
        return files

I propose we handle the lazy initialization of combos at the widget level rather than with any server-side caching.

The approach would be:

  1. Add lazy option to COMBO widgets
  2. Use a Proxy on lazy widget's values property that computes/fetches on first access
  3. Share computation results between widgets using a memoization key based on the underlying data source (e.g. folder path + extensions for file listings)

This gives us:

  • Initialization only when the values are actually needed
  • Automatic sharing of values between widgets accessing the same data
  • Fine-grained control over when to refresh (via forceUpdate(), dependency list, or similar)
  • No need for complex server-side caching or file system watchers (although folder_paths.get_filename_list is already cached based on folder mtime)
  • More extendable system for other lazy widgets in future
  • Very few requests or file system traversals by default
  • Still benefit from object_info payload size reduction

The main tradeoff is that, by default, combo values can become stale (which matches current behavior). If real-time file system sync is needed for specific widgets, dependencies can be added or forceUpdate can be called (avoiding a full object_info request).

Example implementation:

  // widgets.ts

  COMBO(node, inputName, inputData: InputSpec, app, widgetName, options = {}) {
    // ...

    if (options.isLazy) {
      const widgetStore = useWidgetStore()
    
      // Create memoization cache key that can be shared with widgets using same (structurally) getter
      // E.g., two widgets on distinct nodes that both need the list of loras.
      const widgetKey = `${inputName}/${JSON.stringify(options)}`

      // Register the getter with the store
      widgetStore.registerGetter(widgetKey, options.getter)

      const origOptions = res.widget.options
      res.widget.options = new Proxy(origOptions, {
        get(target, prop) {
          if (prop !== 'values') return target[prop]

          // widgetStore.getValues will call the getter associated with the widget's key.
          // If any dependencies have changed since last access, the value will be re-evaluated.
          return widgetStore.getValues(widgetKey)
        }
      })

      // This can be called via `app.graph.nodes` reference
      res.widget.forceUpdate = () => {
        // Value is deleted which forces re-evaluation on next access
        widgetStore.clearCache(widgetKey)
        node.graph.setDirtyCanvas(true)
      }
    }

    // ... (rest of the original COMBO constructor)
    return res
  },

  FILE_COMBO(node: LGraphNode, inputName: string, inputData: InputSpec) {
    const {
      folder_path,
      filter_content_type,
      filter_extension,
      include_directories
    } = inputData[1]
    const getter = async () => {
      const res = await api.getFilepaths({
        folder_path,
        filter_content_type,
        filter_extension,
        include_directories
      })
      return res.files
    }
    return ComfyWidgets.COMBO(node, inputName, inputData, app, inputName, {
      isLazy: true,
      getter
    })
  },

Any thoughts or feedback is appreciated.

@huchenlei (Member, Author):

I propose we handle the lazy initialization of combos at the widget level rather than with any server-side caching.

The approach would be:

  1. Add lazy option to COMBO widgets
  2. Use a Proxy on lazy widget's values property that computes/fetches on first access
  3. Share computation results between widgets using a memoization key based on the underlying data source (e.g. folder path + extensions for file listings)

LGTM.

Some extra random thoughts regarding the implementation:

  • We might want to add a refresh button on each FILE_COMBO widget so the user can individually refresh that specific widget. However, the initial implementation can just have the "refresh node definitions" action do a full refresh; the per-widget refresh button can be implemented separately.
  • We might want to rethink the format of FILE_COMBO, as, besides the folders under /model, the /inputs and /outputs folders also need a similar mechanism for caching purposes. We might want to implement a more general mechanism like DYNAMIC_COMBO that fetches options dynamically. Your impl here is already pretty close to that; we might just rename FILE_COMBO to DYNAMIC_COMBO, or keep everything in the existing COMBO with an extra getter param.

@christian-byrne (Contributor) commented Feb 3, 2025

Thanks for the reply.

We might want to implement a more general mechanism like DYNAMIC_COMBO that fetches options dynamically. Your impl here is already pretty close to that; we might just rename FILE_COMBO to DYNAMIC_COMBO, or keep everything in the existing COMBO with an extra getter param.

I agree. We should make the getter configurable through the Python input data and construct the JS function inside the COMBO constructor.

The spec should be made generic enough to be extended beyond internal file lists.

For example, instead of this:

class LoadLatent:
    @classmethod
    def INPUT_TYPES(s):
        return {
            "required": {
                "latent": (
                    "FILE_COMBO",
                    {"folder_path": "input", "filter_extension": [".latent"]},
                )
            },
        }

Something like this:

class LoadLatent:
    @classmethod
    def INPUT_TYPES(s):
        return {
            "required": {
                "latent": (
                    "COMBO",
                    {
                        "getter": {
                            "type": "file_list",
                            "folder_path": "input",
                            "filter_extension": [".latent"],
                        }
                    },
                )
            },
        }

Or this:

class LoadLatent:
    @classmethod
    def INPUT_TYPES(s):
        return {
            "required": {
                "latent": (
                    "COMBO",
                    {
                        "getter": {
                            "type": "route",
                            "route": "/internal/files?folder_path=input&extension=.latent",
                            "backoff": 500,
                            "TTL": 10_000 # default is infinity
                        }
                    },
                )
            },
        }

WDYT?
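
As a rough sketch of the other half of that idea (building the JS getter from the Python-declared spec inside the COMBO constructor), something like the following could work; all names here are assumptions, not an agreed API:

// Hypothetical: build a fetcher from the "getter" object declared in INPUT_TYPES.
// Field names ("type", "folder_path", "route", ...) follow the examples above.
type GetterSpec =
  | { type: 'file_list'; folder_path: string; filter_extension?: string[] }
  | { type: 'route'; route: string; TTL?: number; backoff?: number }

function buildGetter(spec: GetterSpec): () => Promise<string[]> {
  if (spec.type === 'file_list') {
    // Assumes an endpoint that lists files for a managed folder, as discussed above.
    return async () => {
      const params = new URLSearchParams({ folder_path: spec.folder_path })
      spec.filter_extension?.forEach((ext) => params.append('extension', ext))
      const res = await fetch(`/internal/files?${params}`)
      return (await res.json()).files
    }
  }
  // Generic route-based getter.
  return async () => {
    const res = await fetch(spec.route)
    return (await res.json()).files
  }
}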

@webfiltered:

Thoughts:

  1. Have we considered the impact that this might have on higher-latency connections?
  2. Similar: Are all requests independent of each other, and able to be run in parallel?
  3. Is there anything that might block request batching being implemented at a later date?

@christian-byrne (Contributor):

We might want to rethink the format of FILE_COMBO, as, besides the folders under /model, the /inputs and /outputs folders also need a similar mechanism for caching purposes.

In this proposal, /inputs and /outputs will be traversed even less than before: after initialization, requests aren't repeated, and widgets making the same request can share a getter. Therefore, server-side caching changes can be considered an unrelated change to be handled in a separate initiative.

However, if an alternative approach, such as a client-side TTL, is used, those changes might become relevant.

@christian-byrne (Contributor):

Have we considered the impact that this might have on higher-latency connections?

The main effect is that part of the frontend startup time will be deferred until nodes are added to the graph. One consequence of this is that graph execution must be disabled until widgets finish initializing.

For example:

  1. User opens the browser.
  2. The initial workflow lacks a LoadImage node.
  3. User opens a workflow containing a LoadImage node.
  4. Execution is blocked until the LoadImage node’s input widget loads.
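
A minimal sketch of such a gate, with assumed field and store names (nothing here is an existing frontend API):

// Hypothetical gate: collect pending lazy-widget initializations on the graph
// and wait for them before allowing prompt execution to proceed.
async function ensureWidgetsInitialized(graph: { nodes: any[] }) {
  const pending: Promise<unknown>[] = []
  for (const node of graph.nodes) {
    for (const widget of node.widgets ?? []) {
      // `initPromise` is an assumed field set when a lazy widget starts fetching.
      if (widget.initPromise) pending.push(widget.initPromise)
    }
  }
  await Promise.all(pending)
}

// Usage (sketch): await ensureWidgetsInitialized(app.graph) before queueing a prompt.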

@christian-byrne (Contributor):

Similar: Are all requests independent of each other, and able to be run in parallel?

Yes, the lazy widgets can use a global cache with routes as keys. The cache can be populated with the promise first, then with the resolved value afterwards.

On access, if the value is uninitialized or stale, the fetch is started in a non-blocking manner. If the cache already contains a promise, the fetch is not repeated. The cache can also track errors and timestamps to support a TTL field and backoff.
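
For illustration, a small sketch of that promise-first cache (names are illustrative only, not the proposed store API):

// Sketch of a shared, promise-first cache keyed by route / memo key.
interface CacheEntry<T> {
  value?: T
  promise?: Promise<T>
  fetchedAt?: number
  lastError?: unknown
}

class LazyValueCache<T> {
  private entries = new Map<string, CacheEntry<T>>()

  // Returns the cached value (possibly undefined on first access) and, if the
  // entry is missing or stale, kicks off a non-blocking fetch exactly once.
  get(key: string, fetcher: () => Promise<T>, ttlMs = Infinity): T | undefined {
    const entry = this.entries.get(key) ?? {}
    this.entries.set(key, entry)

    const stale =
      entry.fetchedAt !== undefined && Date.now() - entry.fetchedAt > ttlMs
    if ((entry.value === undefined || stale) && !entry.promise) {
      entry.promise = fetcher()
      entry.promise
        .then((v) => {
          entry.value = v
          entry.fetchedAt = Date.now()
        })
        .catch((err) => {
          entry.lastError = err  // recorded so callers can apply backoff
        })
        .finally(() => {
          entry.promise = undefined
        })
    }
    return entry.value  // undefined until the first fetch resolves
  }
}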

Is there anything that might block request batching being implemented at a later date?

Since the requests are managed in a centralized store, it should be possible to batch them.

@christian-byrne (Contributor):

The lazy-widget and COMBO-spec aspects of this RFC are being developed on this branch: Comfy-Org/ComfyUI_frontend@main...lazy-widgets

@christian-byrne added the "active" label (RFC accepted and undergoing implementation; the feature may be shipped as experimental) and removed the "pending" label (RFC still in comments phase) on Feb 4, 2025
@christian-byrne christian-byrne self-assigned this Feb 4, 2025
@robinjhuang (Member):

I think this can merge now since it's actively being worked on.

@christian-byrne merged commit bbbfcb6 into main on Feb 6, 2025
@christian-byrne deleted the rfc_01 branch on February 6, 2025 at 00:25