ReflectionPad3D fails at runtime #2576

@metascroy

Description

🐞 Describing the bug

torch.ops.aten.pad.default in reflect mode converts successfully with ct.convert, but the compiled model then fails at predict() time. Ideally the unsupported configuration (reflect padding over more than two dimensions) would be rejected at conversion time instead.
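For context, reflect mode mirrors interior values around each edge rather than filling a constant, which is why it cannot simply be lowered to a constant pad. A minimal pure-Python sketch of the 1-D semantics (function name is illustrative):

```python
def reflect_pad_1d(xs, pad):
    # Reflect mode mirrors interior elements around each edge,
    # without repeating the edge element itself (requires pad < len(xs)).
    assert pad < len(xs)
    left = xs[1:pad + 1][::-1]     # interior elements after the first, reversed
    right = xs[-pad - 1:-1][::-1]  # interior elements before the last, reversed
    return left + xs + right

print(reflect_pad_1d([1, 2, 3, 4], 2))  # -> [3, 2, 1, 2, 3, 4, 3, 2]
```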

Stack Trace

/opt/miniconda3/envs/op-et/lib/python3.10/site-packages/coremltools/models/model.py:560: RuntimeWarning: You will not be able to run predict() on this Core ML model. Underlying exception message was: Error compiling model: "Failed to parse the model specification. Error: Unable to parse ML Program: in operation pad_cast_fp16: Padding for more than two dimensions only supports `constant` mode".
  _warnings.warn(
Traceback (most recent call last):
  File "/Users/scroy/Desktop/executorch/test.py", line 160, in <module>
    out = mlmodel.predict(predict_inputs)
  File "/opt/miniconda3/envs/op-et/lib/python3.10/site-packages/coremltools/models/model.py", line 804, in predict
    raise self._framework_error
  File "/opt/miniconda3/envs/op-et/lib/python3.10/site-packages/coremltools/models/model.py", line 549, in _get_proxy_and_spec
    _MLModelProxy(
RuntimeError: Error compiling model: "Failed to parse the model specification. Error: Unable to parse ML Program: in operation pad_cast_fp16: Padding for more than two dimensions only supports `constant` mode".

To Reproduce

import torch

class Model(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.pad = torch.nn.ReflectionPad3d(padding=2)
        
    def forward(self, x):
        return self.pad(x)

model = Model()
inputs = (
    torch.randn(1, 6, 6, 6, 6),
)

eager_outputs = model(*inputs)
#print(f"Eager: {eager_outputs.shape} {eager_outputs}")

ep = torch.export.export(model.eval(), inputs)
print(ep)


import coremltools as ct
import numpy as np

ep = ep.run_decompositions({})

mlmodel = ct.convert(ep)

coreml_inputs = mlmodel.get_spec().description.input
coreml_outputs = mlmodel.get_spec().description.output
predict_inputs = {str(ct_in.name): pt_in.detach().cpu().numpy().astype(np.float32) for ct_in, pt_in in zip(coreml_inputs, inputs)}
out = mlmodel.predict(predict_inputs)

print("Eager", eager_outputs)
print("CoreML", out)
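As a possible workaround until the converter supports (or rejects ahead of time) reflect padding over three dimensions, the same result can be built from `torch.flip` and `torch.cat`, so no reflect-mode pad op is emitted. A sketch, not verified against Core ML itself; `manual_reflection_pad3d` is a hypothetical helper:

```python
import torch

def manual_reflection_pad3d(x, p):
    # Reflect-pad the last three dims of x by p on each side,
    # using only narrow/flip/cat instead of a reflect-mode pad op.
    for dim in (-1, -2, -3):
        front = x.narrow(dim, 1, p).flip(dim)                 # mirror after the first element
        back = x.narrow(dim, x.size(dim) - p - 1, p).flip(dim)  # mirror before the last element
        x = torch.cat([front, x, back], dim=dim)
    return x

x = torch.randn(1, 6, 6, 6, 6)
ref = torch.nn.ReflectionPad3d(2)(x)
assert torch.equal(manual_reflection_pad3d(x, 2), ref)
```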

System environment (please complete the following information):

  • coremltools version: 8.3
  • OS (e.g. MacOS version or Linux type): macOS 15

    Labels

    bug: Unexpected behaviour that should be corrected (type)
