"wrap_with_autocast" is not implemented #3836

Open
Jiangggg1995 opened this issue Oct 30, 2024 · 0 comments

Comments

@Jiangggg1995

Hello,

I am currently working on translating the Llama2 model to MLIR using torch-mlir (with transformers==4.40.2 and pytorch==2.5.1). However, I encountered the following error:

NotImplementedError: Higher-order operation 'wrap_with_autocast' not implemented in the FxImporter (tried '_import_hop_wrap_with_autocast')
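Roughly, the export path I am using looks like the sketch below (the checkpoint name, input shape, and dtype are placeholders; the relevant part is that the model goes through torch.export and the torch-mlir FX importer):

```python
# Minimal sketch of the export path that hits the error. Checkpoint name,
# sequence length, and dtype are placeholders, not the exact values I use.
import torch
from transformers import AutoModelForCausalLM
from torch_mlir import fx

model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-2-7b-hf",   # placeholder checkpoint
    torch_dtype=torch.float32,
)
model.eval()

input_ids = torch.randint(0, model.config.vocab_size, (1, 16))

# export_and_import runs torch.export and then the FxImporter; the
# NotImplementedError about 'wrap_with_autocast' is raised during the import step.
module = fx.export_and_import(model, input_ids)
print(module)
```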
It appears that the unimplemented operation, wrap_with_autocast, originates in the LlamaRotaryEmbedding class, where it seems to be generated by the with torch.autocast(device_type=device_type, enabled=False) context manager.
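
For illustration, here is a reduced standalone sketch (my own toy module, not code from transformers) with the same autocast-disabled pattern; on PyTorch 2.5 the exported graph may contain the same higher-order op:

```python
# Toy module mimicking the autocast-disabled region in LlamaRotaryEmbedding.
import torch

class RotaryLike(torch.nn.Module):
    def forward(self, x):
        # Disable autocast around the cos/sin computation, as the rotary
        # embedding does.
        with torch.autocast(device_type="cpu", enabled=False):
            return torch.cos(x.float()), torch.sin(x.float())

ep = torch.export.export(RotaryLike(), (torch.randn(2, 8),))
# The exported graph may wrap the body of the autocast block in
# torch.ops.higher_order.wrap_with_autocast, which is the op the
# FxImporter does not handle yet.
print(ep)
```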

I would appreciate any guidance or assistance on how to resolve this problem.

Thank you!
