Description
I'm using the docker image; this issue has persisted across quite a few image versions (409ad8ba2be3d01e9e9e230b20ff368f1372d808c618b5fb174c5262c3f42e83, b33d94ef096d0a314a6a5f9417683a789e4abf3eb2e2190f60aa6607594ca0d6, and ones from a few weeks ago).
When I try to import the official MNIST int8 example (mnist-12-int8.onnx from https://github.com/onnx/models/tree/main/validated/vision/classification/mnist), I don't get a descriptive error message, just an internal error `not ranked`, which seems to come from the verifier: `op.emitError("not ranked");`
If I follow the TensorFlow quantization example (https://www.tensorflow.org/model_optimization/guide/quantization/training_example) and then use tf2onnx to convert the resulting tflite model into an ONNX file, I get the following assertion failure instead:
onnx-mlir: /workdir/onnx-mlir/src/Conversion/ONNXToKrnl/Math/Elementwise.cpp:2431: auto onnx_mlir::ONNXElementwiseUnaryOpLowering<mlir::ONNXDequantizeLinearOp>::matchAndRewrite(mlir::ONNXDequantizeLinearOp, OpAdaptor, ConversionPatternRewriter &)::(anonymous class)::operator()(const KrnlBuilder &, ValueRange) const [ElementwiseUnaryOp = mlir::ONNXDequantizeLinearOp]: Assertion `isScalarValue(operands[i]) && "unary expected scalar additional values"' failed.
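The conversion step was roughly the following; the file names are placeholders and the opset is simply what I happened to use:

```sh
# Convert the quantization-aware-training tflite model from the TF tutorial
# into ONNX. File names are placeholders; opset 13 is just what I picked.
python -m tf2onnx.convert --tflite mnist_quant.tflite \
  --output mnist_tflite.onnx --opset 13
```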
The converted file is attached here for convenience.
mnist_tflite.onnx.gz