modular_model_converter cannot handle objects imported via try/except #35414

Open
2 of 4 tasks
HIT-cwh opened this issue Dec 25, 2024 · 0 comments
HIT-cwh commented Dec 25, 2024

System Info

transformers 4.48.0.dev0 d8c1db2

Who can help?

@ArthurZucker

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
  • My own task or dataset (give details below)

Reproduction

How to reproduce?

  1. Clone the Transformers repository and check out the specified commit:

git clone [email protected]:huggingface/transformers.git && cd transformers && git checkout d8c1db2f568d4bcc254bc046036acf0d6bba8373

  2. Create a new folder named xxx_model in src/transformers/models/.

  3. Inside this folder, create a new Python file called modular_xxx.py with the following content:

import torch
import torch.nn as nn
try:
    import torch.nn.functional as F
except:
    pass

from ..llama.modeling_llama import (
    LlamaMLP,
)

class Model(nn.Module):
    def forward(self, x, w):
        return F.linear(x, w)
  4. Run the following command to execute the model converter:

python utils/modular_model_converter.py --files_to_parse src/transformers/models/xxx_model/modular_xxx.py

This will generate the modeling file at: src/transformers/models/xxx_model/modeling_xxx.py.

Expected behavior

Expected vs Actual Contents in src/transformers/models/xxx_model/modeling_xxx.py

The expected contents in src/transformers/models/xxx_model/modeling_xxx.py is :

#                🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨
#           This file was automatically generated from src/transformers/models/xxx_model/modular_xxx.py.
#               Do NOT edit this file manually as any edits will be overwritten by the generation of
#             the file from the modular. If any change should be done, please apply the change to the
#                          modular_xxx.py file directly. One of our CI enforces this.
#                🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨
import torch.nn as nn

try:
    import torch.nn.functional as F
except:
    pass

class Model(nn.Module):
    def forward(self, x, w):
        return F.linear(x, w)

However, the actual content generated in src/transformers/models/xxx_model/modeling_xxx.py is :

#                🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨
#           This file was automatically generated from src/transformers/models/xxx_model/modular_xxx.py.
#               Do NOT edit this file manually as any edits will be overwritten by the generation of
#             the file from the modular. If any change should be done, please apply the change to the
#                          modular_xxx.py file directly. One of our CI enforces this.
#                🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨
import torch.nn as nn


class Model(nn.Module):
    def forward(self, x, w):
        return F.linear(x, w)

Issue

The try: import torch.nn.functional as F except: pass block is missing from the generated file, even though it exists in the original modular file. As a result, the generated modeling_xxx.py references F without importing it and raises a NameError at runtime.
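For context, here is a minimal sketch of the suspected failure mode, using the standard-library ast module (the actual converter is built on different tooling, so this is only an illustration of the general pattern, not the converter's real code): an import collector that only inspects module-level statements never sees imports nested inside a try/except block, whereas a recursive walk of the tree does.

```python
import ast

source = """
import torch
import torch.nn as nn
try:
    import torch.nn.functional as F
except ImportError:
    pass
"""

tree = ast.parse(source)

# Naive collection: only statements that are themselves import nodes
# at module level. The try/except statement is an ast.Try node, so the
# import inside it is skipped entirely.
top_level_only = [
    node for node in tree.body
    if isinstance(node, (ast.Import, ast.ImportFrom))
]

# Recursive collection: ast.walk visits every node in the tree, so the
# import wrapped in try/except is found as well.
recursive = [
    node for node in ast.walk(tree)
    if isinstance(node, (ast.Import, ast.ImportFrom))
]

print(len(top_level_only))  # 2 -- torch.nn.functional is missed
print(len(recursive))       # 3 -- torch.nn.functional is included
```

If the converter's import handling resembles the naive variant, the F import is dropped and the generated file fails at runtime, matching the behavior above.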

@HIT-cwh HIT-cwh added the bug label Dec 25, 2024