
Various crashes/assertion failures when trying to import quantized models #3236

@nuudlman

Description

I'm using the docker image; this issue has persisted across quite a few image versions (409ad8ba2be3d01e9e9e230b20ff368f1372d808c618b5fb174c5262c3f42e83, b33d94ef096d0a314a6a5f9417683a789e4abf3eb2e2190f60aa6607594ca0d6, and ones from a few weeks ago).

When I try to import the official MNIST int8 example (mnist-12-int8.onnx from https://github.com/onnx/models/tree/main/validated/vision/classification/mnist), I don't get a useful error message, just an internal "not ranked" error, which seems to come from the verifier.
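For reference, this is roughly how I invoke the compiler inside the container (a minimal sketch; the `--EmitONNXIR` target and the local file name are just what I happened to use):

```python
# Minimal repro sketch. mnist-12-int8.onnx is the file downloaded from the
# model zoo page linked above; the onnx-mlir binary from the docker image is
# assumed to be on PATH.
import subprocess

subprocess.run(
    ["onnx-mlir", "--EmitONNXIR", "mnist-12-int8.onnx"],
    check=True,  # raises CalledProcessError if the compiler exits non-zero
)
```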

If I follow the TensorFlow quantization-aware training example (https://www.tensorflow.org/model_optimization/guide/quantization/training_example) and then use tf2onnx to convert the resulting tflite model into an ONNX file, I get the following assertion failure:

onnx-mlir: /workdir/onnx-mlir/src/Conversion/ONNXToKrnl/Math/Elementwise.cpp:2431: auto onnx_mlir::ONNXElementwiseUnaryOpLowering<mlir::ONNXDequantizeLinearOp>::matchAndRewrite(mlir::ONNXDequantizeLinearOp, OpAdaptor, ConversionPatternRewriter &)::(anonymous class)::operator()(const KrnlBuilder &, ValueRange) const [ElementwiseUnaryOp = mlir::ONNXDequantizeLinearOp]: Assertion `isScalarValue(operands[i]) && "unary expected scalar additional values"' failed.
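For reference, the steps I used to produce the attached model were roughly the following (a sketch reconstructed from the tutorial; exact hyperparameters and API details may differ depending on the TensorFlow / tensorflow_model_optimization versions):

```python
import tensorflow as tf
import tensorflow_model_optimization as tfmot

(x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
x_train = x_train / 255.0

# Small MNIST classifier from the quantization-aware training tutorial.
model = tf.keras.Sequential([
    tf.keras.layers.InputLayer(input_shape=(28, 28)),
    tf.keras.layers.Reshape(target_shape=(28, 28, 1)),
    tf.keras.layers.Conv2D(filters=12, kernel_size=(3, 3), activation="relu"),
    tf.keras.layers.MaxPooling2D(pool_size=(2, 2)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10),
])
model.compile(
    optimizer="adam",
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=["accuracy"],
)
model.fit(x_train, y_train, epochs=1)

# Wrap the trained model for quantization-aware training and fine-tune briefly.
q_model = tfmot.quantization.keras.quantize_model(model)
q_model.compile(
    optimizer="adam",
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=["accuracy"],
)
q_model.fit(x_train, y_train, epochs=1)

# Convert the QAT model to a quantized tflite flatbuffer.
converter = tf.lite.TFLiteConverter.from_keras_model(q_model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
with open("mnist_tflite.tflite", "wb") as f:
    f.write(converter.convert())

# The tflite file was then converted to ONNX with tf2onnx, roughly:
#   python -m tf2onnx.convert --tflite mnist_tflite.tflite --output mnist_tflite.onnx
```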

The file is attached here for convenience.
mnist_tflite.onnx.gz
