[tokenizer] fix ChineseBertTokenizer with max_length/max_seq_len #8926
Conversation
Thanks for your contribution!
Codecov Report
❌ Patch coverage is 0.00%. Please upload reports for the commit 4c265bb to get more accurate results.
❌ Your patch status has failed because the patch coverage (0.00%) is below the target coverage (80.00%). You can increase the patch coverage or adjust the target coverage.

Additional details and impacted files:
@@           Coverage Diff            @@
##           develop    #8926     +/-  ##
===========================================
- Coverage    55.01%   55.00%   -0.02%
===========================================
  Files          646      646
  Lines       102242   102030     -212
===========================================
- Hits         56252    56124     -128
+ Misses       45990    45906      -84

☔ View full report in Codecov by Sentry.
…gth-problem-and-pypinyin-import
This Pull Request is stale because it has been open for 60 days with no activity.
Automatically closed by Paddle-bot. |
PR types
Bug fixes
PR changes
Models: paddlenlp/transformers/chinesebert/tokenizer.py
Description
Fix the issue that ChineseBertTokenizer does not accept the max_length argument, and change the way pypinyin is imported.
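The two changes the description names could be sketched as follows. This is a hypothetical illustration, not the actual PaddleNLP patch: `normalize_length_kwargs` and `load_pinyin` are made-up helper names, chosen to show the general pattern of aliasing a legacy `max_seq_len` keyword to `max_length` and of importing pypinyin lazily so the tokenizer module still loads when the optional dependency is absent.

```python
def normalize_length_kwargs(**kwargs):
    """Map the legacy `max_seq_len` keyword onto `max_length`.

    An explicitly supplied `max_length` takes precedence when both
    keywords are given; `max_seq_len` is always removed afterwards.
    """
    if "max_seq_len" in kwargs:
        legacy = kwargs.pop("max_seq_len")
        kwargs.setdefault("max_length", legacy)
    return kwargs


def load_pinyin():
    """Import pypinyin only when pinyin features are actually needed.

    Deferring the import means merely importing the tokenizer module
    does not require pypinyin to be installed; a clear error is raised
    at the point of use instead.
    """
    try:
        from pypinyin import Style, pinyin
    except ImportError as exc:
        raise ImportError(
            "pypinyin is required for pinyin features; "
            "install it with `pip install pypinyin`."
        ) from exc
    return pinyin, Style
```

With this shim, a call such as `tokenizer(text, max_seq_len=128)` would be forwarded as `max_length=128` instead of raising a TypeError for the unknown keyword.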