
[WebGPU] Only OnBoard Graphic Card is taken #723

Closed
theurichde opened this issue Nov 6, 2024 · 2 comments
Labels
bug Something isn't working

Comments

@theurichde

Bug description

  • I run MiniSearch on my laptop, which has one Nvidia card and one onboard Intel graphics card.
  • When enabling WebGPU, I cannot choose which card to use, so the (slow) Intel card is picked.

Steps to reproduce

  1. Have two graphics cards available
  2. Enable WebGPU
  3. Try to choose another card

Expected behavior

I would like to choose the card MiniSearch uses for inference.

Device info

  • Laptop
  • Windows 11
  • 2 graphics cards: Intel integrated and Nvidia 3060

Additional context

(btw: awesome project!)

theurichde added the bug label on Nov 6, 2024
@felladrin
Owner

Hi, @theurichde! Thanks for the detailed report!

It seems to be related to this issue from Web-LLM:

Could you please run the following two commands in the browser console and share the results here?

await navigator.gpu.requestAdapter()
await navigator.gpu.requestAdapter({powerPreference: 'high-performance'})
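If it helps, here is a rough sketch for checking which adapter each preference resolves to (untested; newer Chrome exposes the info on adapter.info, while older builds had an adapter.requestAdapterInfo() method instead):

for (const powerPreference of ['low-power', 'high-performance']) {
  const adapter = await navigator.gpu.requestAdapter({ powerPreference });
  // GPUAdapterInfo: newer Chrome puts it on adapter.info; fall back to the
  // older requestAdapterInfo() method if that property is missing.
  const info = adapter?.info ?? (await adapter?.requestAdapterInfo?.());
  console.log(powerPreference, info?.vendor, info?.architecture, info?.description);
}

If both preferences print the Intel adapter, the browser is ignoring the power preference, which would match the Web-LLM issue above.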

In that same thread [1], a user also posted a workaround for Windows:

You can force Chrome in windows to use the more powerful GPU by going to the Display>Graphics>Apps page, adding chrome, clicking options, and setting to use dedicated GPU.


Another workaround is to use Ollama or any other inference engine with an OpenAI-compatible API that you know works with your dedicated GPU, and connect to it via the 'Remote Server (API)' setting in the Menu.
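For reference, Ollama serves an OpenAI-compatible API at http://localhost:11434/v1 by default, so a quick connectivity check could look roughly like this (a sketch only; the model name is just an example and must be pulled locally first):

// Query a local Ollama server through its OpenAI-compatible endpoint.
const response = await fetch('http://localhost:11434/v1/chat/completions', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    model: 'llama3.2', // example model; use whatever you have pulled
    messages: [{ role: 'user', content: 'Hello!' }],
  }),
});
console.log((await response.json()).choices[0].message.content);

If that works, pointing the 'Remote Server (API)' setting at the same base URL should run inference on the Nvidia card.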

@theurichde
Author

Running both commands shows only the Intel card, so it's related to the WebLLM issue you shared.
Meanwhile, I will go with the Ollama approach.

Thank you! Keep up the good work 👋🏻
