## Chinese BERT with Whole Word Masking

### Please use 'Bert' related functions to load this model!

To further accelerate Chinese natural language processing, we provide a **Chinese pre-trained BERT with Whole Word Masking**.

**[Pre-Training with Whole Word Masking for Chinese BERT](https://arxiv.org/abs/1906.08101)**

Yiming Cui, Wanxiang Che, Ting Liu, Bing Qin, Ziqing Yang, Shijin Wang, Guoping Hu

This repository is developed based on: https://github.com/google-research/bert

You may also be interested in:

- Chinese BERT series: https://github.com/ymcui/Chinese-BERT-wwm
- Chinese MacBERT: https://github.com/ymcui/MacBERT
- Chinese ELECTRA: https://github.com/ymcui/Chinese-ELECTRA
- Chinese XLNet: https://github.com/ymcui/Chinese-XLNet
- Knowledge Distillation Toolkit - TextBrewer: https://github.com/airaria/TextBrewer

More resources by HFL: https://github.com/ymcui/HFL-Anthology

## How to Use

```python
!pip install --upgrade paddlenlp
```

```python
import paddle
from paddlenlp.transformers import AutoModel

model = AutoModel.from_pretrained("hfl/chinese-roberta-wwm-ext-large")
input_ids = paddle.randint(100, 200, shape=[1, 20])
print(model(input_ids))
```

A fuller usage sketch that feeds the model real tokenized text is given at the end of this page.

## Citation

If you find the technical report or resources useful, please cite the following technical report in your paper.

- Primary: https://arxiv.org/abs/2004.13922

```
@inproceedings{cui-etal-2020-revisiting,
  title = "Revisiting Pre-Trained Models for {C}hinese Natural Language Processing",
  author = "Cui, Yiming and Che, Wanxiang and Liu, Ting and Qin, Bing and Wang, Shijin and Hu, Guoping",
  booktitle = "Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: Findings",
  month = nov,
  year = "2020",
  address = "Online",
  publisher = "Association for Computational Linguistics",
  url = "https://www.aclweb.org/anthology/2020.findings-emnlp.58",
  pages = "657--668",
}
```

- Secondary: https://arxiv.org/abs/1906.08101

```
@article{chinese-bert-wwm,
  title={Pre-Training with Whole Word Masking for Chinese BERT},
  author={Cui, Yiming and Che, Wanxiang and Liu, Ting and Qin, Bing and Yang, Ziqing and Wang, Shijin and Hu, Guoping},
  journal={arXiv preprint arXiv:1906.08101},
  year={2019}
}
```

> This model card and its weights come from [https://huggingface.co/hfl/chinese-roberta-wwm-ext-large](https://huggingface.co/hfl/chinese-roberta-wwm-ext-large) and have been converted to the PaddlePaddle model format.
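The snippet in "How to Use" drives the model with random token ids. Below is a minimal sketch of feeding real text instead: it assumes the matching tokenizer is published under the same model name and loads it via `AutoTokenizer` (not shown in the snippet above), then converts the tokenizer output to Paddle tensors before the forward pass.

```python
import paddle
from paddlenlp.transformers import AutoModel, AutoTokenizer

# Assumption: the tokenizer is available under the same name as the model.
MODEL_NAME = "hfl/chinese-roberta-wwm-ext-large"
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModel.from_pretrained(MODEL_NAME)

# Tokenize a sample sentence; the tokenizer returns Python lists of ids,
# so wrap them in a batch dimension and convert to tensors.
encoded = tokenizer("使用整词掩码的中文预训练BERT。")
input_ids = paddle.to_tensor([encoded["input_ids"]])
token_type_ids = paddle.to_tensor([encoded["token_type_ids"]])

# Forward pass; for this BERT-style model the outputs are expected to be
# the per-token sequence output and the pooled [CLS] representation.
sequence_output, pooled_output = model(input_ids, token_type_ids=token_type_ids)
print(sequence_output.shape, pooled_output.shape)
```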