TensorFlow Lite Inference Crash with tf.reverse(x, axis=[]) #388

Open
gaikwadrahul8 opened this issue Nov 27, 2024 · 3 comments
Labels: status:awaiting user response, status:stale, type:support

Comments

@gaikwadrahul8
1. System information

  • OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Ubuntu 22
  • TensorFlow installation (pip package or built from source): pip install tf-nightly (Python 3.9)
  • TensorFlow library (version, if pip package or github SHA, if built from source):
pip show tf-nightly                                           
Name: tf-nightly
Version: 2.16.0.dev20231221
...

2. Code

Colab link: https://colab.research.google.com/drive/1gAsclHMWEf9in0wkF-y1nIbbFrh1m11V?usp=sharing

import tensorflow as tf

tf.reverse(tf.ones((1,), dtype=tf.float32), [])  # no problem

class Foo(tf.Module):
    @tf.function(input_signature=[tf.TensorSpec(shape=[None], dtype=tf.float32)])
    def reverse(self, x):
        # works fine with axis=[0]; crashes with axis=[]
        return tf.reverse(x, axis=[])

foo = Foo()
converter = tf.lite.TFLiteConverter.from_concrete_functions(
    funcs=[foo.reverse.get_concrete_function()],
    trackable_obj=foo,
)

tflite_model = converter.convert()
interpreter = tf.lite.Interpreter(model_content=tflite_model)
# crash
interpreter.get_signature_runner()(x=tf.ones((1,), dtype=tf.float32))

3. Failure after conversion

The converted model crashes at inference even though the model itself is fully valid: the same op runs without error in eager mode.
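For context, reversing over an empty axis list is semantically a no-op. A NumPy analogue (an illustration only, assuming NumPy is available) shows the intended identity behaviour:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])

# Reversing over an empty list of axes leaves the array unchanged;
# this matches what tf.reverse(x, axis=[]) does in eager mode.
y = np.flip(x, axis=[])
assert np.array_equal(y, x)

# Reversing over axis 0 actually flips the vector, matching axis=[0].
assert np.array_equal(np.flip(x, axis=[0]), np.array([3.0, 2.0, 1.0]))
```

This suggests the crash lies in the TFLite runtime's handling of an empty axis tensor, not in the model definition.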

@gaikwadrahul8 (Author)

This issue, originally reported by @ganler, has been moved to this dedicated repository for ai-edge-torch to improve issue tracking and prioritization. To ensure continuity, we have created this new issue on your behalf.

We appreciate your understanding and look forward to your continued involvement.

@pkgoogle (Contributor)

Hi @ganler

The original recommendation:

import torch
import torch.nn as nn
import ai_edge_torch

class Foo(nn.Module):
    def __init__(self):
        super().__init__()

    def forward(self, x):
        return torch.flip(x, dims=[])

foo = Foo()

sample_input = (torch.randn(1),)

# Convert the model using AI Edge Torch
edge_model = ai_edge_torch.convert(foo.eval(), sample_input)

# Export the model to TFLite format
edge_model.export('foo_model.tflite')

This should resolve the problem. Does it resolve your issue?
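As a quick sanity check of the recommended path (a sketch assuming only PyTorch is installed), torch.flip with an empty dims list mirrors the empty-axis semantics of tf.reverse in eager mode:

```python
import torch

x = torch.tensor([1.0, 2.0, 3.0])

# Flipping over an empty list of dims is a no-op, matching
# tf.reverse(x, axis=[]) in eager TensorFlow.
assert torch.equal(torch.flip(x, dims=[]), x)

# Flipping over dim 0 reverses the vector, matching axis=[0].
assert torch.equal(torch.flip(x, dims=[0]), torch.tensor([3.0, 2.0, 1.0]))
```

So the torch.flip(x, dims=[]) module above preserves the original model's behaviour before conversion through ai_edge_torch.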

@pkgoogle added the status:awaiting user response and type:support labels on Dec 13, 2024

Marking this issue as stale since it has been open for 7 days with no activity. This issue will be closed if no further activity occurs.
