
Commit a396f43

SohamPrabhu, Your Name, Your Name, and stevhliu authored
Update roc bert docs (#38835)
* Moved the sources to the right
* Small changes
* Some changes to Moonshine
* Added the install to the pipeline
* Updated the Moonshine model card
* Update docs/source/en/model_doc/moonshine.md — Co-authored-by: Steven Liu <[email protected]>
* Update docs/source/en/model_doc/moonshine.md — Co-authored-by: Steven Liu <[email protected]>
* Update docs/source/en/model_doc/moonshine.md — Co-authored-by: Steven Liu <[email protected]>
* Update docs/source/en/model_doc/moonshine.md — Co-authored-by: Steven Liu <[email protected]>
* Updated documentation according to changes
* Fixed the model with the commits
* Changes to the roc_bert
* Final update to the branch
* Adds quantization to the model
* Finished fixing the roc_bert docs
* Fixed Moshi
* Fixed problems
* Fixed problems
* Fixed problems
* Fixed problems
* Fixed problems
* Fixed problems
* Added the install to the pipeline
* Updated the Moonshine model card
* Update docs/source/en/model_doc/moonshine.md — Co-authored-by: Steven Liu <[email protected]>
* Update docs/source/en/model_doc/moonshine.md — Co-authored-by: Steven Liu <[email protected]>
* Update docs/source/en/model_doc/moonshine.md — Co-authored-by: Steven Liu <[email protected]>
* Updated documentation according to changes
* Fixed the model with the commits
* Fixed the problems
* Final fix
* Final fix
* Final fix
* Update roc_bert.md

---------

Co-authored-by: Your Name <[email protected]>
Co-authored-by: Your Name <[email protected]>
Co-authored-by: Steven Liu <[email protected]>
1 parent 3ae52cc commit a396f43

File tree

1 file changed: +63 −24 lines changed

docs/source/en/model_doc/roc_bert.md

Lines changed: 63 additions & 24 deletions
@@ -14,39 +14,78 @@ rendered properly in your Markdown viewer.
 
 -->
 
+<div style="float: right;">
+    <div class="flex flex-wrap space-x-1">
+        <img alt="PyTorch" src="https://img.shields.io/badge/PyTorch-DE3412?style=flat&logo=pytorch&logoColor=white">
+    </div>
+</div>
+
 # RoCBert
 
-<div class="flex flex-wrap space-x-1">
-<img alt="PyTorch" src="https://img.shields.io/badge/PyTorch-DE3412?style=flat&logo=pytorch&logoColor=white">
-</div>
+[RoCBert](https://aclanthology.org/2022.acl-long.65.pdf) is a pretrained Chinese [BERT](./bert) model designed against adversarial attacks like typos and synonyms. It is pretrained with a contrastive learning objective to align normal and adversarial text examples. The examples include different semantic, phonetic, and visual features of Chinese. This makes RoCBert more robust against manipulation.
+
+You can find all the original RoCBert checkpoints under the [weiweishi](https://huggingface.co/weiweishi) profile.
+
+> [!TIP]
+> This model was contributed by [weiweishi](https://huggingface.co/weiweishi).
+>
+> Click on the RoCBert models in the right sidebar for more examples of how to apply RoCBert to different Chinese language tasks.
+
+The example below demonstrates how to predict the [MASK] token with [`Pipeline`], [`AutoModel`], and from the command line.
+
+<hfoptions id="usage">
+<hfoption id="Pipeline">
+
+```py
+import torch
+from transformers import pipeline
+
+pipeline = pipeline(
+    task="fill-mask",
+    model="weiweishi/roc-bert-base-zh",
+    torch_dtype=torch.float16,
+    device=0
+)
+pipeline("這家餐廳的拉麵是我[MASK]過的最好的拉麵之")
+```
+
+</hfoption>
+<hfoption id="AutoModel">
+
+```py
+import torch
+from transformers import AutoModelForMaskedLM, AutoTokenizer
 
-## Overview
+tokenizer = AutoTokenizer.from_pretrained(
+    "weiweishi/roc-bert-base-zh",
+)
+model = AutoModelForMaskedLM.from_pretrained(
+    "weiweishi/roc-bert-base-zh",
+    torch_dtype=torch.float16,
+    device_map="auto",
+)
+inputs = tokenizer("這家餐廳的拉麵是我[MASK]過的最好的拉麵之", return_tensors="pt").to("cuda")
 
-The RoCBert model was proposed in [RoCBert: Robust Chinese Bert with Multimodal Contrastive Pretraining](https://aclanthology.org/2022.acl-long.65.pdf) by HuiSu, WeiweiShi, XiaoyuShen, XiaoZhou, TuoJi, JiaruiFang, JieZhou.
-It's a pretrained Chinese language model that is robust under various forms of adversarial attacks.
+with torch.no_grad():
+    outputs = model(**inputs)
+predictions = outputs.logits
 
-The abstract from the paper is the following:
+masked_index = torch.where(inputs['input_ids'] == tokenizer.mask_token_id)[1]
+predicted_token_id = predictions[0, masked_index].argmax(dim=-1)
+predicted_token = tokenizer.decode(predicted_token_id)
 
-*Large-scale pretrained language models have achieved SOTA results on NLP tasks. However, they have been shown
-vulnerable to adversarial attacks especially for logographic languages like Chinese. In this work, we propose
-ROCBERT: a pretrained Chinese Bert that is robust to various forms of adversarial attacks like word perturbation,
-synonyms, typos, etc. It is pretrained with the contrastive learning objective which maximizes the label consistency
-under different synthesized adversarial examples. The model takes as input multimodal information including the
-semantic, phonetic and visual features. We show all these features are important to the model robustness since the
-attack can be performed in all the three forms. Across 5 Chinese NLU tasks, ROCBERT outperforms strong baselines under
-three blackbox adversarial algorithms without sacrificing the performance on clean testset. It also performs the best
-in the toxic content detection task under human-made attacks.*
+print(f"The predicted token is: {predicted_token}")
+```
 
-This model was contributed by [weiweishi](https://huggingface.co/weiweishi).
+</hfoption>
+<hfoption id="transformers CLI">
 
-## Resources
+```bash
+echo -e "這家餐廳的拉麵是我[MASK]過的最好的拉麵之" | transformers-cli run --task fill-mask --model weiweishi/roc-bert-base-zh --device 0
+```
 
-- [Text classification task guide](../tasks/sequence_classification)
-- [Token classification task guide](../tasks/token_classification)
-- [Question answering task guide](../tasks/question_answering)
-- [Causal language modeling task guide](../tasks/language_modeling)
-- [Masked language modeling task guide](../tasks/masked_language_modeling)
-- [Multiple choice task guide](../tasks/multiple_choice)
+</hfoption>
+</hfoptions>
 
 ## RoCBertConfig
 
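The new card describes RoCBert as combining semantic, phonetic, and visual (shape) features of Chinese characters, while its usage examples stay on the generic `pipeline` and Auto classes. The sketch below is a minimal, hedged illustration of the model-specific classes covered by the API reference that follows the diff (`## RoCBertConfig` onwards); it assumes `RoCBertTokenizer` and `RoCBertForMaskedLM` load from the `weiweishi/roc-bert-base-zh` checkpoint and that the matched tokenizer already attaches any extra shape/pronunciation tensors the model consumes, so its output can be forwarded unchanged.

```py
# Minimal sketch (not part of the commit above): fill-mask with the RoCBert-specific classes.
# Assumes RoCBertTokenizer and RoCBertForMaskedLM resolve for this checkpoint and that the
# matched tokenizer already emits any extra shape/pronunciation tensors the model expects.
import torch
from transformers import RoCBertForMaskedLM, RoCBertTokenizer

tokenizer = RoCBertTokenizer.from_pretrained("weiweishi/roc-bert-base-zh")
model = RoCBertForMaskedLM.from_pretrained("weiweishi/roc-bert-base-zh")

# Forward the tokenizer output unchanged; the matched tokenizer is expected to provide
# every input tensor the model's forward pass needs.
inputs = tokenizer("這家餐廳的拉麵是我[MASK]過的最好的拉麵之", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Decode the top prediction at the [MASK] position, mirroring the AutoModel example in the diff.
masked_index = torch.where(inputs["input_ids"] == tokenizer.mask_token_id)[1]
print(tokenizer.decode(logits[0, masked_index].argmax(dim=-1)))
```

If the checkpoint's tokenizer does not expose the extra feature tensors, the AutoModel path shown in the diff remains the safer default.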