Add webnn-gpu and webnn-npu support for MODNet #49

Open · wants to merge 2 commits into main

Conversation


@ibelem commented Mar 28, 2025

Add webnn-gpu and webnn-npu support for MODNet with the following updates:

  1. Update @huggingface/transformers to 3.4.1, which includes onnxruntime-web 1.22.0-dev.20250306-ccf8fdd9ea. This ORT Web version includes the fix that allows ops to ignore an empty tensor passed as input, so Resize nodes with a zero-dimensional roi input are handled correctly.
    • In opset 11 the 1-D roi input was required; starting from opset 13 it became optional.
    • Some ONNX conversion tools emitted an empty roi tensor when they should have elided it completely.
  2. Keep the WebGPU code path
  3. Add webnn-gpu and webnn-npu links and use the fp16 MODNet model
  4. Add freeDimensionOverrides for the WebNN paths only (see the sketch after this list)
  5. After setting freeDimensionOverrides, disable the sizeSlider for the WebNN paths, since the input shape is now fixed
  6. Please note that WebNN NPU support for MODNet requires the next generation of WebNN in Chromium-based browsers; it does not work in current builds.
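
For readers who want a picture of what the configuration changes look like, here is a minimal sketch using the transformers.js API. It is not the exact diff in this PR: the `sizeSlider` id is taken from the description above, while the model id `Xenova/modnet`, the `session_options` pass-through, and the free-dimension names `batch_size`/`height`/`width` (pinned to 1×512×512) are assumptions rather than values from the change set.

```js
import { AutoModel } from '@huggingface/transformers';

// Device is picked from the demo links, e.g. ?device=webgpu, ?device=webnn-gpu, ?device=webnn-npu.
const device = new URLSearchParams(location.search).get('device') ?? 'webgpu';

// Use the fp16 MODNet model on the WebNN backends, keep fp32 for the WebGPU path.
const options = { device, dtype: device.startsWith('webnn') ? 'fp16' : 'fp32' };

if (device.startsWith('webnn')) {
  // WebNN needs static shapes, so pin the model's free dimensions.
  // The dimension names and the 512x512 size are assumptions; check the MODNet ONNX inputs.
  options.session_options = {
    freeDimensionOverrides: { batch_size: 1, height: 512, width: 512 },
  };
  // With the input shape fixed, the size slider no longer applies.
  document.getElementById('sizeSlider')?.setAttribute('disabled', '');
}

// 'Xenova/modnet' is used here only as an illustrative model id.
const model = await AutoModel.from_pretrained('Xenova/modnet', options);
```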

@xenova PTAL

CC @huningxin @Honry
