
Commit 528cb73

Catherine Balajadia authored and facebook-github-bot committed
[3PL Remediation][Thorough Review Needed][Transformers] upgrade fblearner/flow/projects move from transformers 3.14.x to 4.4x
Summary: The 3P Library Vulnerability Remediation Team is dedicated to remediating high-risk external libraries at Meta using both manual and automated processes. Older versions of this library have been identified as risky, and this diff stack upgrades the library to a recommended version.

We kindly request your help with the diff review. Please commandeer this diff stack if specific merges need to be added or if there are any build or dependency failures.

General changes:
1. Update the library reference from the old version to the newer version in bzl files.
2. Update import references from the old library paths to the newer ones in the affected code.

TPMS: https://fburl.com/third_party_metadata/e12wxl9w

Vulnerability Information:
- CVE-2023-2800 (cvss3=4.7) https://www.internalfb.com/intern/vulnerability_management/vulnerabilities/CVE-2023-2800
- CVE-2023-6730 (cvss3=8.8) https://www.internalfb.com/intern/vulnerability_management/vulnerabilities/CVE-2023-6730
- CVE-2023-7018 (cvss3=7.8) https://www.internalfb.com/intern/vulnerability_management/vulnerabilities/CVE-2023-7018
- CVE-2024-11392 (cvss3=8.8) https://www.internalfb.com/intern/vulnerability_management/vulnerabilities/CVE-2024-11392
- CVE-2024-11393 (cvss3=8.8) https://www.internalfb.com/intern/vulnerability_management/vulnerabilities/CVE-2024-11393
- CVE-2024-11394 (cvss3=8.8) https://www.internalfb.com/intern/vulnerability_management/vulnerabilities/CVE-2024-11394
- CVE-2024-3568 (cvss3=3.4) https://www.internalfb.com/intern/vulnerability_management/vulnerabilities/CVE-2024-3568
- SNYK-PYTHON-TRANSFORMERS-3092483 (cvss3=5.4) https://www.internalfb.com/intern/vulnerability_management/vulnerabilities/SNYK-PYTHON-TRANSFORMERS-3092483
- SNYK-PYTHON-TRANSFORMERS-6220003 (cvss3=6.5) https://www.internalfb.com/intern/vulnerability_management/vulnerabilities/SNYK-PYTHON-TRANSFORMERS-6220003

Reviewed By: diliop, ebsmothers

Differential Revision: D71553197

fbshipit-source-id: 582019e390a961254b72fbd806d55eab69a2a16e
1 parent b9e217d commit 528cb73

File tree

1 file changed: +19 additions, -41 deletions

mmf/modules/hf_layers.py

Lines changed: 19 additions & 41 deletions
@@ -7,47 +7,25 @@
 from mmf.utils.patch import restore_saved_modules, safecopy_modules
 from torch import nn, Tensor
 
-try:
-    from transformers3.modeling_bert import (
-        BertAttention,
-        BertEmbeddings,
-        BertEncoder,
-        BertLayer,
-        BertModel,
-        BertPooler,
-        BertSelfAttention,
-        BertSelfOutput,
-    )
-    from transformers3.modeling_roberta import (
-        RobertaAttention,
-        RobertaEmbeddings,
-        RobertaEncoder,
-        RobertaLayer,
-        RobertaModel,
-        RobertaSelfAttention,
-    )
-    from transformers3.modeling_utils import PreTrainedModel
-except ImportError:
-    from transformers.modeling_bert import (
-        BertAttention,
-        BertEmbeddings,
-        BertEncoder,
-        BertLayer,
-        BertModel,
-        BertPooler,
-        BertSelfAttention,
-        BertSelfOutput,
-    )
-    from transformers.modeling_roberta import (
-        RobertaAttention,
-        RobertaEmbeddings,
-        RobertaEncoder,
-        RobertaLayer,
-        RobertaModel,
-        RobertaSelfAttention,
-    )
-    from transformers.modeling_utils import PreTrainedModel
-
+from transformers.modeling_utils import PreTrainedModel
+from transformers.models.bert.modeling_bert import (
+    BertAttention,
+    BertEmbeddings,
+    BertEncoder,
+    BertLayer,
+    BertModel,
+    BertPooler,
+    BertSelfAttention,
+    BertSelfOutput,
+)
+from transformers.models.roberta.modeling_roberta import (
+    RobertaAttention,
+    RobertaEmbeddings,
+    RobertaEncoder,
+    RobertaLayer,
+    RobertaModel,
+    RobertaSelfAttention,
+)
 
 patch_functions = [
     "BertEmbeddings.forward",
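The removed try/except existed because transformers 4.x moved the flat `transformers.modeling_bert` modules under per-model packages (`transformers.models.bert.modeling_bert`). As a minimal sketch of that kind of layout fallback, assuming nothing about this repo's build setup, one can probe which module path resolves without importing it (`first_importable` is a hypothetical helper, not part of this diff):

```python
import importlib.util


def first_importable(candidates):
    """Return the first module path in `candidates` that resolves to an
    installed module, or None if none do. This mirrors the try/except
    import fallback the old code used, but via find_spec, so nothing is
    actually imported while probing."""
    for path in candidates:
        try:
            if importlib.util.find_spec(path) is not None:
                return path
        except ModuleNotFoundError:
            # A parent package is missing entirely; try the next layout.
            continue
    return None


# Prefer the transformers >= 4.x layout, falling back to the pre-4.x one.
bert_module = first_importable([
    "transformers.models.bert.modeling_bert",  # transformers >= 4.x
    "transformers.modeling_bert",              # transformers < 4.x
])
```

After the upgrade, only the first candidate resolves, which is why the diff can drop the fallback and import the 4.x path unconditionally.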
