[Bug] Resize N-D import failure: TVM only supports 4D resize2d, but ONNX Resize supports N-D tensors #18608

@dutZ1855

Description

Expected behavior

TVM should be able to import and compile an ONNX Resize model with non-4D input tensors (e.g. 3D), matching ONNX Runtime behavior (and also OpenVINO, which can run this model).

Per the ONNX Resize operator spec, Resize supports resizing N-D tensors and is not restricted to 4D inputs.
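To illustrate why rank should not need special-casing, here is a minimal pure-Python sketch of nearest-neighbor resize for arbitrary rank. It is illustrative only: it uses the "asymmetric" coordinate transform with floor rounding, one of several modes the ONNX spec defines.

```python
import math

def resize_nearest(data, in_shape, out_sizes):
    """Nearest-neighbor resize of a flat buffer of any rank (sketch).

    Uses the 'asymmetric' coordinate transform with floor rounding;
    the ONNX spec defines several other modes.
    """
    scales = [o / i for o, i in zip(out_sizes, in_shape)]

    def strides(dims):
        out, acc = [], 1
        for d in reversed(dims):
            out.append(acc)
            acc *= d
        return out[::-1]

    in_strides, out_strides = strides(in_shape), strides(out_sizes)
    total = math.prod(out_sizes)
    result = [None] * total
    for flat in range(total):
        # Unflatten the output index, map each axis back to the input,
        # then flatten the source index again.
        rem, out_idx = flat, []
        for st in out_strides:
            out_idx.append(rem // st)
            rem %= st
        src = [min(int(i / s), d - 1)
               for i, s, d in zip(out_idx, scales, in_shape)]
        result[flat] = data[sum(i * st for i, st in zip(src, in_strides))]
    return result

# Rank is not special-cased: 3D works the same way as 4D.
print(resize_nearest([1.0, 2.0], in_shape=[1, 1, 2], out_sizes=[1, 1, 4]))
# -> [1.0, 1.0, 2.0, 2.0]
```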

Actual behavior

For the following model,

[screenshot: ONNX model graph containing the Resize node with a 3D input]

When importing the attached model with TVM Relax (tvm.relax.frontend.onnx.from_onnx), TVM fails during ONNX conversion with:

AssertionError: Only resize2d is currently supported.

The failure is from TVM's ONNX Resize converter which asserts ndims == 4.

Error converting operator Resize, with inputs: [x, None, None, sizes]
Traceback (most recent call last):
  File "DLCompilers/bug/tvm/resize_only_resize2d/run_repro.py", line 66, in <module>
    test(onnx_model)  
    ^^^^^^^^^^^^^^^^
  File "DLCompilers/bug/tvm/resize_only_resize2d/run_repro.py", line 51, in test
    tvm_model = from_onnx(model, opset=18, keep_params_in_input=True)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "DLCompilers/tvm/python/tvm/relax/frontend/onnx/onnx_frontend.py", line 4260, in from_onnx
    return g.from_onnx(graph, opset)
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "DLCompilers/tvm/python/tvm/relax/frontend/onnx/onnx_frontend.py", line 3890, in from_onnx
    self._construct_nodes(graph)
  File "DLCompilers/tvm/python/tvm/relax/frontend/onnx/onnx_frontend.py", line 4071, in _construct_nodes
    raise err
  File "DLCompilers/tvm/python/tvm/relax/frontend/onnx/onnx_frontend.py", line 4066, in _construct_nodes
    op = self._convert_operator(op_name, inputs, attr, self.opset)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "DLCompilers/tvm/python/tvm/relax/frontend/onnx/onnx_frontend.py", line 4166, in _convert_operator
    sym = op_function(self.bb, inputs, attrs, [self._nodes, self._params])
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "DLCompilers/tvm/python/tvm/relax/frontend/onnx/onnx_frontend.py", line 2221, in _impl_v18
    assert ndims == 4, "Only resize2d is currently supported."
           ^^^^^^^^^^
AssertionError: Only resize2d is currently supported.

ONNX Runtime can execute the same model successfully.

ORT run finished
[[[0.1257]]]
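Until the converter handles other ranks, one possible workaround is to route an N-D resize through the supported 4D path by inserting a unit dimension and squeezing it back afterwards. A numpy sketch of the idea, using hypothetical shapes (the attached model's actual shapes are not shown) and nearest-mode upsampling by 2 along the last axis:

```python
import numpy as np

# Hypothetical 3D input (N, C, W).
x3 = np.arange(6, dtype=np.float32).reshape(1, 2, 3)

# Lift to 4D by inserting a unit H axis, the rank a resize2d path expects.
x4 = x3[:, :, np.newaxis, :]      # (1, 2, 1, 3)

# Nearest-mode upsample by 2 along W -- what a 4D resize2d would compute here.
y4 = np.repeat(x4, 2, axis=3)     # (1, 2, 1, 6)

# Squeeze the unit axis to recover the 3D result.
y3 = y4[:, :, 0, :]               # (1, 2, 6)
print(y3.shape)
```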

Environment

Operating System: Ubuntu 22.04.4 LTS
TVM version: 0.23.0dev
PyTorch version: 2.9.1
onnxruntime version: 1.23.2
onnx version: 1.20.0
Python version: 3.11.14

Steps to reproduce

Download the model and run the following code to obtain the results.

model.zip

import sys
import os
from pathlib import Path

import numpy as np
import onnx
import onnxruntime
from onnx import ModelProto

_REPO_ROOT = Path(__file__).resolve().parents[3]
_TVM_PYTHON = _REPO_ROOT / "tvm" / "python"
_TVM_BUILD = _REPO_ROOT / "tvm" / "build"
if _TVM_PYTHON.exists():
    sys.path.insert(0, _TVM_PYTHON.as_posix())
os.environ.setdefault("TVM_LIBRARY_PATH", _TVM_BUILD.as_posix())

import tvm
import tvm.testing
from tvm import relax
from tvm.relax.frontend.onnx import from_onnx

import pickle

def test(model: ModelProto) -> None:
    model.ir_version = 8
    model.opset_import[0].version = 18
 
    with open("oracle.pkl", 'rb') as fp:
        inputs = pickle.load(fp)
    try:
        ort_session = onnxruntime.InferenceSession(
            model.SerializeToString(), providers=["CPUExecutionProvider"]
        )
        feed = inputs.get("input", inputs) if isinstance(inputs, dict) else inputs
        ort_output = ort_session.run([], feed)
        print("ORT run finished")
        for tensor in ort_output:
            print(tensor)
            print("-" * 40)
    except Exception as e:
        print(f"This model cannot be executed by onnxruntime: {e}")
        sys.exit(1)

    tvm_model = from_onnx(model, opset=18, keep_params_in_input=True)
    tvm_model = relax.transform.DecomposeOpsForInference()(tvm_model)
    tvm_model = relax.transform.LegalizeOps()(tvm_model)

    tvm_model, params = relax.frontend.detach_params(tvm_model)
    with tvm.transform.PassContext(opt_level=3):
        ex = tvm.compile(tvm_model, target="llvm")
   
if __name__ == "__main__":
    onnx_model = onnx.load("model.onnx")
    test(onnx_model)
