System information
OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Linux Debian 11
TensorFlow installed from (source or binary): compiled from source
TensorFlow version (or github SHA if from source): 744dad26ef526690319042030f776e6f7e62dbc8
Standalone code to reproduce the issue
Provide a reproducible test case that is the bare minimum necessary to generate
the problem. If possible, please share a link to Colab/Jupyter/any notebook.
import numpy as np
import tensorflow as tf

# Input dimensions: sequenceLength and numFeatures must match the Input layer
# below; batchSize is arbitrary (1 is assumed here).
batchSize = 1
sequenceLength = 3
numFeatures = 5

# Define and create the model
model = tf.keras.models.Sequential([
    tf.keras.layers.Input(shape=(3, 5), name='input'),
    tf.keras.layers.LSTM(10, time_major=False, return_sequences=True)
])
model.compile(optimizer='adam',
              loss='mean_squared_error',
              metrics=['accuracy'])

# Trace a concrete function with a fixed input signature for export.
run_model = tf.function(lambda x: model(x))
concrete_func = run_model.get_concrete_function(
    tf.TensorSpec([batchSize, sequenceLength, numFeatures], model.inputs[0].dtype))

print("hidden_states: ", model.layers[0].states[0])
print("cell_states: ", model.layers[0].states[1])

# Model directory.
MODEL_DIR = "/tmp/lstmNet"
model.save(MODEL_DIR, save_format="tf", signatures=concrete_func)

# Convert the SavedModel to TFLite.
converter = tf.lite.TFLiteConverter.from_saved_model(MODEL_DIR)
tflite_model = converter.convert()

# Save the TF Lite model.
with tf.io.gfile.GFile('/tmp/lstmNet.tflite', 'wb') as f:
    f.write(tflite_model)
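For a quick sanity check after conversion, the saved file can be loaded with the Python TFLite interpreter and run on random input. This is a minimal sketch assuming the /tmp/lstmNet.tflite path and the (1, 3, 5) input shape used above:

import numpy as np
import tensorflow as tf

# Load the converted model and allocate tensors.
interpreter = tf.lite.Interpreter(model_path='/tmp/lstmNet.tflite')
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Run the model on a random batch shaped like the concrete function's signature.
x = np.random.rand(1, 3, 5).astype(np.float32)
interpreter.set_tensor(input_details[0]['index'], x)
interpreter.invoke()
print("output shape:", interpreter.get_tensor(output_details[0]['index']).shape)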
Any other info / logs
Using flatbuffer_translate to convert the generated TFLite model to MLIR produces:
Given that model.layers[0].states[1] is None, the InputCellState value for the tfl.unidirectional_sequence_lstm op (%12) should be all zeros, but it is all ones. Note that this value is interpreted as all zeros when using the tf_tfl_translate command:
Here %cst_10, which is the InputCellState value, is all zeros. So the error comes from the TFLite-file-to-MLIR conversion when it is invoked via the flatbuffer_translate command.
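To help narrow down where the wrong values originate, one option is to dump the tensors stored in the .tflite file directly from Python: if the state constants inside the flatbuffer are zeros, the all-ones values would have to be introduced during the flatbuffer_translate import. This is a minimal sketch assuming the file produced by the script above; values can only be read for tensors that carry constant data.

import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path='/tmp/lstmNet.tflite')
interpreter.allocate_tensors()

# List every tensor in the flatbuffer and try to read the LSTM state tensors.
for detail in interpreter.get_tensor_details():
    name = detail['name'].lower()
    if 'state' in name or 'lstm' in name:
        try:
            print(detail['index'], detail['name'], interpreter.get_tensor(detail['index']))
        except ValueError:
            # Non-constant tensors have no data before invoke().
            print(detail['index'], detail['name'], '<no constant data>')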
Include any logs or source code that would be helpful to diagnose the problem.
If including tracebacks, please include the full traceback. Large logs and files
should be attached.
This issue, originally reported by @sahas3, has been moved to this dedicated repository for ai-edge-torch to improve issue tracking and prioritization. To ensure continuity, we have created this new issue on your behalf.
We appreciate your understanding and look forward to your continued involvement.