
How much RAM do you need? #43

Open
notBradPitt opened this issue Jun 17, 2024 · 2 comments

Comments

@notBradPitt

How much RAM do I need to run this node? I'm using 16GB RAM and 16GB VRAM and still couldn't get a result.

RAM usage reaches 96% before ComfyUI crashes at the last stage (muse_pose). Even with height and width set to 9x16 (literally nine and sixteen), it still doesn't work. Why does it use more RAM than it does VRAM? Did I set it up incorrectly?

@jhj0517
Contributor

jhj0517 commented Jun 18, 2024

In my test, it required ~17GB of VRAM with fp16 dtype at 512x512 resolution.
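For context on why the dtype matters: fp16 stores each parameter in half the bytes of fp32, so weight memory roughly halves. A throwaway PyTorch illustration, nothing specific to this repo's models:

import torch

# Throwaway layer just to show that fp16 halves per-parameter memory versus fp32
layer = torch.nn.Linear(4096, 4096)
fp32_mb = sum(p.numel() * p.element_size() for p in layer.parameters()) / 1e6
fp16_mb = sum(p.numel() * p.element_size() for p in layer.half().parameters()) / 1e6
print(f"fp32: {fp32_mb:.1f} MB, fp16: {fp16_mb:.1f} MB")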

If it's failing even at 9x16, then something else is wrong. I'd guess you're hitting a different error, not a CUDA error.

Why does it use more RAM than it does VRAM?

You should check whether CUDA is available:

import torch
# True means PyTorch can see the GPU; False means everything falls back to CPU and system RAM.
print(torch.cuda.is_available())
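If that prints True but system RAM still climbs, you can also check which GPU PyTorch sees and how much memory is free on it; this is a generic PyTorch check, not anything specific to this node:

import torch

# Generic check of the visible GPU and its free/total memory (mem_get_info returns bytes)
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))
    free, total = torch.cuda.mem_get_info()
    print(f"free: {free / 1e9:.1f} GB / total: {total / 1e9:.1f} GB")
else:
    print("CUDA not available - the workload will run on the CPU and system RAM")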

@sunsetleoli

Same issue. I'm using 16GB VRAM and want to produce a video at 360x640 resolution and 30 fps, but I get an OOM error.
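If it helps, a generic workaround (not specific to this repo) is to catch the CUDA OOM and retry at a smaller resolution; run_muse_pose below is just a placeholder for whatever call actually launches the generation:

import torch

# Hypothetical fallback loop: free the CUDA cache after an OOM and retry at a smaller size.
# `run_muse_pose` is a placeholder for the real generation call.
def run_with_fallback(run_muse_pose, resolutions=((360, 640), (288, 512), (216, 384))):
    for w, h in resolutions:
        try:
            return run_muse_pose(width=w, height=h)
        except torch.cuda.OutOfMemoryError:
            torch.cuda.empty_cache()
            print(f"OOM at {w}x{h}, retrying at a smaller size...")
    raise RuntimeError("out of memory at every resolution tried")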
