
Sampler for Image to Video ERROR: Torch not compiled with CUDA enabled (Intel Arc XPU) #31

Open
junwei161 opened this issue Jan 20, 2025 · 1 comment

Comments

@junwei161

Hello, could you add support for Intel Arc XPU graphics cards?

Sampler for Image to Video, ERROR:

*-display
description: VGA compatible controller
product: DG2 [Arc A770]
vendor: Intel Corporation
physical id: 0
bus info: pci@0000:03:00.0
logical name: /dev/fb0
version: 08
width: 64 bits
clock: 33MHz
capabilities: pciexpress msi pm vga_controller bus_master cap_list rom fb
configuration: depth=32 driver=i915 latency=0 resolution=1920,1080
resources: iomemory:400-3ff irq:159 memory:81000000-81ffffff memory:4000000000-43ffffffff memory:82000000-821fffff



ComfyUI Error Report

Error Details

  • Node ID: 19
  • Node Type: Ruyi_I2VSampler
  • Exception Type: AssertionError
  • Exception Message: Torch not compiled with CUDA enabled

Stack Trace

  File "/home/junwei161/ComfyUI/execution.py", line 327, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
                                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/home/junwei161/ComfyUI/execution.py", line 202, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/home/junwei161/ComfyUI/execution.py", line 174, in _map_node_over_list
    process_inputs(input_dict, i)

  File "/home/junwei161/ComfyUI/execution.py", line 163, in process_inputs
    results.append(getattr(obj, func)(**inputs))
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/home/junwei161/ComfyUI/custom_nodes/Ruyi-Models/comfyui/comfyui_nodes.py", line 530, in process
    pipeline.enable_sequential_cpu_offload()

  File "/home/junwei161/ComfyUI/custom_nodes/Ruyi-Models/ruyi/pipeline/pipeline_ruyi_inpaint.py", line 230, in enable_sequential_cpu_offload
    super().enable_sequential_cpu_offload(*args, **kwargs)

  File "/root/miniconda3/envs/arc31208251_env/lib/python3.12/site-packages/diffusers/pipelines/pipeline_utils.py", line 1166, in enable_sequential_cpu_offload
    cpu_offload(model, device, offload_buffers=offload_buffers)

  File "/root/miniconda3/envs/arc31208251_env/lib/python3.12/site-packages/accelerate/big_modeling.py", line 198, in cpu_offload
    attach_align_device_hook(

  File "/root/miniconda3/envs/arc31208251_env/lib/python3.12/site-packages/accelerate/hooks.py", line 425, in attach_align_device_hook
    attach_align_device_hook(

  File "/root/miniconda3/envs/arc31208251_env/lib/python3.12/site-packages/accelerate/hooks.py", line 425, in attach_align_device_hook
    attach_align_device_hook(

  File "/root/miniconda3/envs/arc31208251_env/lib/python3.12/site-packages/accelerate/hooks.py", line 416, in attach_align_device_hook
    add_hook_to_module(module, hook, append=True)

  File "/root/miniconda3/envs/arc31208251_env/lib/python3.12/site-packages/accelerate/hooks.py", line 156, in add_hook_to_module
    module = hook.init_hook(module)
             ^^^^^^^^^^^^^^^^^^^^^^

  File "/root/miniconda3/envs/arc31208251_env/lib/python3.12/site-packages/accelerate/hooks.py", line 272, in init_hook
    set_module_tensor_to_device(module, name, self.execution_device)

  File "/root/miniconda3/envs/arc31208251_env/lib/python3.12/site-packages/accelerate/utils/modeling.py", line 339, in set_module_tensor_to_device
    new_value = old_value.to(device)
                ^^^^^^^^^^^^^^^^^^^^

  File "/root/miniconda3/envs/arc31208251_env/lib/python3.12/site-packages/torch/cuda/__init__.py", line 310, in _lazy_init
    raise AssertionError("Torch not compiled with CUDA enabled")
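The final frame is the key: accelerate's offload hook calls `old_value.to(device)` with a CUDA execution device, and any `.to("cuda")` on a PyTorch build compiled without CUDA support raises this same assertion. A minimal repro, independent of ComfyUI (the helper name is illustrative):

```python
import torch

def probe_cuda_move():
    """Attempt the same tensor move that accelerate's offload hook performs.

    On a CPU-only or XPU-only PyTorch build this raises
    AssertionError("Torch not compiled with CUDA enabled"); a CUDA build
    without a visible GPU may raise a RuntimeError instead, so both are
    caught here.
    """
    try:
        torch.ones(1).to("cuda")
        return "cuda move succeeded"
    except (AssertionError, RuntimeError) as exc:
        return f"cuda move failed: {exc}"

print(probe_cuda_move())
```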

System Information

  • ComfyUI Version: unknown
  • Arguments: main.py --fast --normalvram --force-upcast-attention --disable-cuda-malloc
  • OS: posix
  • Python Version: 3.12.8 | packaged by Anaconda, Inc. | (main, Dec 11 2024, 16:31:09) [GCC 11.2.0]
  • Embedded Python: false
  • PyTorch Version: 2.5.1+cxx11.abi

Devices

  • Name: xpu
    • Type: xpu
    • VRAM Total: 16225243136
    • VRAM Free: 16225243136
    • Torch VRAM Total: 0
    • Torch VRAM Free: 0

Logs

2025-01-20T23:59:08 - Traceback (most recent call last):
  File "/root/miniconda3/envs/arc31208251_env/lib/python3.12/asyncio/base_events.py", line 686, in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
  File "/home/junwei161/ComfyUI/custom_nodes/ComfyUI-Manager/glob/manager_server.py", line 1434, in default_cache_update
    await asyncio.gather(a, b, c, d, e)
  File "/home/junwei161/ComfyUI/custom_nodes/ComfyUI-Manager/glob/manager_server.py", line 1421, in get_cache
    json_obj = await manager_util.get_data(uri, True)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/junwei161/ComfyUI/custom_nodes/ComfyUI-Manager/glob/manager_util.py", line 125, in get_data
    json_obj = json.loads(json_text)
               ^^^^^^^^^^^^^^^^^^^^^
  File "/root/miniconda3/envs/arc31208251_env/lib/python3.12/json/__init__.py", line 346, in loads
    return _default_decoder.decode(s)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/miniconda3/envs/arc31208251_env/lib/python3.12/json/decoder.py", line 338, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/miniconda3/envs/arc31208251_env/lib/python3.12/json/decoder.py", line 356, in raw_decode
    raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
2025-01-21T00:00:04 - got prompt
2025-01-21T00:01:15 - Loading pipeline components...: 0it [00:00, ?it/s]
2025-01-21T00:01:17.051550 - !!! Exception during processing !!! Torch not compiled with CUDA enabled
2025-01-21T00:01:17.056578 - Traceback (most recent call last):
  File "/home/junwei161/ComfyUI/execution.py", line 327, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
                                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/junwei161/ComfyUI/execution.py", line 202, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/junwei161/ComfyUI/execution.py", line 174, in _map_node_over_list
    process_inputs(input_dict, i)
  File "/home/junwei161/ComfyUI/execution.py", line 163, in process_inputs
    results.append(getattr(obj, func)(**inputs))
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/junwei161/ComfyUI/custom_nodes/Ruyi-Models/comfyui/comfyui_nodes.py", line 530, in process
    pipeline.enable_sequential_cpu_offload()
  File "/home/junwei161/ComfyUI/custom_nodes/Ruyi-Models/ruyi/pipeline/pipeline_ruyi_inpaint.py", line 230, in enable_sequential_cpu_offload
    super().enable_sequential_cpu_offload(*args, **kwargs)
  File "/root/miniconda3/envs/arc31208251_env/lib/python3.12/site-packages/diffusers/pipelines/pipeline_utils.py", line 1166, in enable_sequential_cpu_offload
    cpu_offload(model, device, offload_buffers=offload_buffers)
  File "/root/miniconda3/envs/arc31208251_env/lib/python3.12/site-packages/accelerate/big_modeling.py", line 198, in cpu_offload
    attach_align_device_hook(
  File "/root/miniconda3/envs/arc31208251_env/lib/python3.12/site-packages/accelerate/hooks.py", line 425, in attach_align_device_hook
    attach_align_device_hook(
  File "/root/miniconda3/envs/arc31208251_env/lib/python3.12/site-packages/accelerate/hooks.py", line 425, in attach_align_device_hook
    attach_align_device_hook(
  File "/root/miniconda3/envs/arc31208251_env/lib/python3.12/site-packages/accelerate/hooks.py", line 416, in attach_align_device_hook
    add_hook_to_module(module, hook, append=True)
  File "/root/miniconda3/envs/arc31208251_env/lib/python3.12/site-packages/accelerate/hooks.py", line 156, in add_hook_to_module
    module = hook.init_hook(module)
             ^^^^^^^^^^^^^^^^^^^^^^
  File "/root/miniconda3/envs/arc31208251_env/lib/python3.12/site-packages/accelerate/hooks.py", line 272, in init_hook
    set_module_tensor_to_device(module, name, self.execution_device)
  File "/root/miniconda3/envs/arc31208251_env/lib/python3.12/site-packages/accelerate/utils/modeling.py", line 339, in set_module_tensor_to_device
    new_value = old_value.to(device)
                ^^^^^^^^^^^^^^^^^^^^
  File "/root/miniconda3/envs/arc31208251_env/lib/python3.12/site-packages/torch/cuda/__init__.py", line 310, in _lazy_init
    raise AssertionError("Torch not compiled with CUDA enabled")
AssertionError: Torch not compiled with CUDA enabled

2025-01-21T00:01:17.057387 - Prompt executed in 72.43 seconds
2025-01-21T00:01:17.224551 - Failed to get ComfyUI version: Command '['git', 'describe', '--tags']' returned non-zero exit status 128.

Attached Workflow

Please make sure that workflow does not contain any sensitive information such as API keys or passwords.

{"last_node_id":33,"last_link_id":28,"nodes":[{"id":16,"type":"VHS_VideoCombine","pos":[1009.7774658203125,126.84111022949219],"size":[317,334],"flags":{},"order":6,"mode":0,"inputs":[{"name":"images","type":"IMAGE","link":13,"label":"图像"},{"name":"audio","type":"AUDIO","link":null,"shape":7,"label":"音频"},{"name":"meta_batch","type":"VHS_BatchManager","link":null,"shape":7,"label":"批次管理"},{"name":"vae","type":"VAE","link":null,"shape":7}],"outputs":[{"name":"Filenames","type":"VHS_FILENAMES","links":null,"label":"文件名"}],"properties":{"Node name for S&R":"VHS_VideoCombine"},"widgets_values":{"frame_rate":24,"loop_count":0,"filename_prefix":"Ruyi-I2V-StartEndFrames","format":"video/h264-mp4","pix_fmt":"yuv420p","crf":19,"save_metadata":true,"trim_to_audio":false,"pingpong":false,"save_output":true,"videopreview":{"hidden":false,"paused":false,"params":{"filename":"Ruyi-I2V-StartEndFrames_00001.mp4","subfolder":"","type":"output","format":"video/h264-mp4","frame_rate":24,"workflow":"Ruyi-I2V-StartEndFrames_00001.png","fullpath":"C:\\AI-video-onekey-20250117\\ComfyUI\\output\\Ruyi-I2V-StartEndFrames_00001.mp4"},"muted":false}}},{"id":24,"type":"Ruyi_TeaCache","pos":[653.2990112304688,-92.5076675415039],"size":[315,154],"flags":{},"order":4,"mode":0,"inputs":[{"name":"ruyi_model","type":"RUYI_MODEL","link":21,"label":"ruyi_model"}],"outputs":[{"name":"ruyi_model","type":"RUYI_MODEL","links":[23],"slot_index":0,"label":"ruyi_model"}],"properties":{"Node name for S&R":"Ruyi_TeaCache"},"widgets_values":[true,0.1,3,1,true]},{"id":17,"type":"LoadImage","pos":[-114.48710632324219,439.02117919921875],"size":[240.88999938964844,419.6499938964844],"flags":{},"order":0,"mode":0,"inputs":[],"outputs":[{"name":"IMAGE","type":"IMAGE","links":[15],"slot_index":0,"label":"图像"},{"name":"MASK","type":"MASK","links":null,"label":"遮罩"}],"properties":{"Node name for 
S&R":"LoadImage"},"widgets_values":["240321232915.png","image"]},{"id":18,"type":"LoadImage","pos":[155.7351837158203,438.60760498046875],"size":[242.10000610351562,421.6499938964844],"flags":{},"order":1,"mode":0,"inputs":[],"outputs":[{"name":"IMAGE","type":"IMAGE","links":[16],"slot_index":0,"label":"图像"},{"name":"MASK","type":"MASK","links":null,"label":"遮罩"}],"properties":{"Node name for S&R":"LoadImage"},"widgets_values":["0324202659.png","image"]},{"id":21,"type":"VHS_VideoCombine","pos":[1968.99560546875,252.7424774169922],"size":[751.6575317382812,334],"flags":{},"order":7,"mode":0,"inputs":[{"name":"images","type":"IMAGE","link":28,"label":"图像"},{"name":"audio","type":"AUDIO","link":null,"shape":7,"label":"音频"},{"name":"meta_batch","type":"VHS_BatchManager","link":null,"shape":7,"label":"批次管理"},{"name":"vae","type":"VAE","link":null,"shape":7}],"outputs":[{"name":"Filenames","type":"VHS_FILENAMES","links":null,"label":"文件名"}],"properties":{"Node name for S&R":"VHS_VideoCombine"},"widgets_values":{"frame_rate":1,"loop_count":0,"filename_prefix":"Ruyi-I2V-StartFrame","format":"video/h264-mp4","pix_fmt":"yuv420p","crf":19,"save_metadata":true,"trim_to_audio":false,"pingpong":false,"save_output":true,"videopreview":{"hidden":false,"paused":false,"params":{"filename":"Ruyi-I2V-StartFrame_00003.mp4","subfolder":"","type":"output","format":"video/h264-mp4","frame_rate":24,"workflow":"Ruyi-I2V-StartFrame_00003.png","fullpath":"C:\\AI-video-onekey-20250117\\ComfyUI\\output\\Ruyi-I2V-StartFrame_00003.mp4"},"muted":false}}},{"id":15,"type":"Ruyi_LoadModel","pos":[-72.35037231445312,-96.2402114868164],"size":[315,154],"flags":{},"order":2,"mode":0,"inputs":[],"outputs":[{"name":"ruyi_model","type":"RUYI_MODEL","links":[22],"slot_index":0,"label":"ruyi_model"}],"properties":{"Node name for 
S&R":"Ruyi_LoadModel"},"widgets_values":["Ruyi-Mini-7B","yes","no","none","fp8_e4m3fn"]},{"id":19,"type":"Ruyi_I2VSampler","pos":[537.3553466796875,446.34320068359375],"size":[327.5999755859375,338],"flags":{},"order":5,"mode":0,"inputs":[{"name":"ruyi_model","type":"RUYI_MODEL","link":23,"label":"ruyi_model"},{"name":"start_img","type":"IMAGE","link":15,"label":"start_img"},{"name":"end_img","type":"IMAGE","link":16,"shape":7,"label":"end_img"}],"outputs":[{"name":"images","type":"IMAGE","links":[13,28],"slot_index":0,"label":"images"}],"properties":{"Node name for S&R":"Ruyi_I2VSampler"},"widgets_values":[48,512,4971874773355,"randomize",25,7,"DDIM","auto","auto","low_memory_mode","5"]},{"id":25,"type":"Ruyi_EnhanceAVideo","pos":[303.1950988769531,-81.68051147460938],"size":[292.72698974609375,130],"flags":{},"order":3,"mode":0,"inputs":[{"name":"ruyi_model","type":"RUYI_MODEL","link":22,"label":"ruyi_model"}],"outputs":[{"name":"ruyi_model","type":"RUYI_MODEL","links":[21],"slot_index":0,"label":"ruyi_model"}],"properties":{"Node name for S&R":"Ruyi_EnhanceAVideo"},"widgets_values":[true,1,0,0]}],"links":[[13,19,0,16,0,"IMAGE"],[15,17,0,19,1,"IMAGE"],[16,18,0,19,2,"IMAGE"],[21,25,0,24,0,"RUYI_MODEL"],[22,15,0,25,0,"RUYI_MODEL"],[23,24,0,19,0,"RUYI_MODEL"],[28,19,0,21,0,"IMAGE"]],"groups":[{"id":1,"title":"B站、Youtube:T8star-Aix","bounding":[-967.9205322265625,-745.988037109375,4167.669921875,308.8702087402344],"color":"#3f789e","font_size":240,"flags":{}}],"config":{},"extra":{"ds":{"scale":0.6967877662371478,"offset":[193.06531039397726,33.60196303931741]},"node_versions":{"ComfyUI-VideoHelperSuite":"cad87a17a3ff5e03c26cf55e4dc90397b5642503","Ruyi-Models":"f6543017c973c5150f3a9072a43a4ca690fb307e","comfy-core":"unknown"},"VHS_latentpreview":false,"VHS_latentpreviewrate":0,"ue_links":[]},"version":0.4}

Additional Context

(Please add any additional context or steps to reproduce the error here)

@cellzero
Collaborator

I think this issue arises because the code defaults to using CUDA devices.

Since I do not have access to an XPU device, it is difficult for me to replicate this scenario. Could you help us modify the code and test its feasibility on XPU? If you can help with the modifications, I think we need to:

  1. Pass the device (this, or something similar, may be what should be used for XPU) to the enable_sequential_cpu_offload method;
  2. Change the "cuda" string to that device in pipeline_ruyi_inpaint (and possibly in other code files as well).

Let's start with these adjustments to see if they enable running on XPU. Looking forward to your response.
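For reference, a hedged sketch of the device-selection helper that step 1 calls for (the function name is illustrative and not from the Ruyi codebase; recent PyTorch builds with Intel's XPU support expose `torch.xpu`):

```python
import torch

def pick_execution_device() -> torch.device:
    """Choose CUDA when available, then Intel XPU, then fall back to CPU.

    The hasattr guard keeps this safe on PyTorch builds that were
    compiled without the torch.xpu backend.
    """
    if torch.cuda.is_available():
        return torch.device("cuda")
    if hasattr(torch, "xpu") and torch.xpu.is_available():
        return torch.device("xpu")
    return torch.device("cpu")

# The chosen device could then replace the hard-coded "cuda" string,
# e.g. pipeline.enable_sequential_cpu_offload(device=device) in diffusers
# versions whose offload method accepts a device argument.
device = pick_execution_device()
print(device)
```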
