I am currently working on translating the Llama2 model (from transformers==4.40.2 and pytorch==2.5.1) to MLIR using torch-mlir. However, I encountered the following error:
NotImplementedError: Higher-order operation 'wrap_with_autocast' not implemented in the FxImporter (tried '_import_hop_wrap_with_autocast')
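For reference, here is a minimal sketch of the export call that hits this path. It assumes the `torch_mlir.fx.export_and_import` entry point; the checkpoint name and input shape are illustrative placeholders rather than an exact repro:

```python
import torch
from transformers import AutoModelForCausalLM
from torch_mlir import fx

# Illustrative checkpoint and input; any Llama2 variant takes the same path.
model = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-7b-hf")
model.eval()
input_ids = torch.randint(0, model.config.vocab_size, (1, 16))

# Raises NotImplementedError: Higher-order operation 'wrap_with_autocast'
# not implemented in the FxImporter
mlir_module = fx.export_and_import(model, input_ids)
```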
It appears that the unimplemented operation, wrap_with_autocast, originates in the LlamaRotaryEmbedding class: it seems to be generated by the context manager with torch.autocast(device_type=device_type, enabled=False) in its forward method.
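The pattern can be reproduced in isolation with a toy module that mirrors only the relevant part of LlamaRotaryEmbedding.forward (a minimal sketch; with the versions above, the exported graph contains the offending higher-order op):

```python
import torch

class RotaryLike(torch.nn.Module):
    def forward(self, x):
        device_type = x.device.type
        # Same pattern as in LlamaRotaryEmbedding.forward
        with torch.autocast(device_type=device_type, enabled=False):
            return x.float().cos()

ep = torch.export.export(RotaryLike(), (torch.randn(4),))
# The graph contains a call to torch.ops.higher_order.wrap_with_autocast,
# which is the op the FxImporter does not handle.
print(ep.graph)
```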
I would appreciate any guidance or assistance on how to resolve this problem.
Thank you!