Experiencing the Chinese StructBERT series of pre-trained language models in the ModelScope model library

Introduction: StructBERT improves and optimizes on the basis of BERT. It introduces two new objective functions, one at the sentence level and one at the word level, in which the order of sentences/words is shuffled and the model must restore it. This helps the machine better master human grammar, deepens its understanding of natural language, and lets the model learn stronger language-structure information.

ModelScope

The ModelScope platform brings together industry-leading pre-trained models and provides many types of high-quality models as open source. Developers can experience and download them on the platform for free.

Introduction to StructBERT

StructBERT improves and optimizes on the basis of BERT. By introducing two new objective functions, at the sentence level and the word level, in which the order of sentences/words is shuffled and the model must restore it, StructBERT helps machines better grasp human grammar, deepens their understanding of natural language, and lets the model learn stronger language-structure information.
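To make the word-level objective concrete, here is a minimal, framework-free sketch of the idea: a random trigram in the input is shuffled, and the training target is to restore the original order. (The real model applies this to subword tokens inside a transformer; this sketch only illustrates the data perturbation.)

```python
import random

def shuffle_trigram(tokens, seed=None):
    """Pick a random trigram and permute it, as in StructBERT's word
    structural objective (sketch). Returns the perturbed sequence plus
    the position and original trigram, i.e. the restoration target."""
    rng = random.Random(seed)
    i = rng.randrange(len(tokens) - 2)   # start of the trigram
    trigram = tokens[i:i + 3]
    shuffled = trigram[:]
    rng.shuffle(shuffled)
    return tokens[:i] + shuffled + tokens[i + 3:], (i, trigram)
```

During pre-training the model sees the shuffled sequence and is trained to predict the original order of the permuted span; the sentence-level objective is analogous, asking whether a sentence precedes, follows, or is unrelated to its neighbor.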

Stable improvement over BERT on public datasets

Chinese series StructBERT

ModelScope has open-sourced Chinese StructBERT models of various sizes, summarized in the table below. We can choose a model size as needed.

We can specify the model id in code to call a model, or pull the model repository to a local machine for use.
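As a sketch, calling a model by its model id through ModelScope's pipeline API might look like the following; the model id is an assumption for illustration, so copy the exact id from the model card you choose.

```python
# Assumed model id; take the real one from the model page on ModelScope.
MODEL_ID = "damo/nlp_structbert_sentence-similarity_chinese-base"

def build_similarity_pipeline(model_id=MODEL_ID):
    # Lazy imports so the sketch is readable without ModelScope installed.
    from modelscope.pipelines import pipeline
    from modelscope.utils.constant import Tasks
    # The pipeline downloads the model from the hub on first use.
    return pipeline(Tasks.sentence_similarity, model=model_id)

# Usage (requires the ModelScope Library):
# p = build_similarity_pipeline()
# print(p(("今天天气不错", "今天天气很好")))
```

Alternatively, the model repository can be cloned locally and the local path passed as `model_id`.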

Tuning StructBERT in ModelScope

We can use our own data to fine-tune (finetune) on the basis of StructBERT and customize our own personalized model. In ModelScope, custom fine-tuning can be done in ten lines of code.

Environment preparation

• To run the model locally, you need to prepare the corresponding environment, including a Python environment (python>=3.7), a deep learning framework (PyTorch or TensorFlow), and the ModelScope Library. For details, see the official documentation: Environment Installation.

• If local installation feels too complicated, the ModelScope platform also provides an online runtime environment, so you can run everything directly in a Notebook. ModelScope also provides free computing resources; for the specific quota, see the official document: Description of Free Quota.
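A quick sanity check for the local prerequisites above can be written in a few lines; this sketch only verifies the Python version and looks for an installed deep learning framework.

```python
import importlib.util
import sys

def check_environment():
    """Check the prerequisites stated above: Python >= 3.7 plus
    at least one of PyTorch or TensorFlow installed."""
    python_ok = sys.version_info >= (3, 7)
    frameworks = [name for name in ("torch", "tensorflow")
                  if importlib.util.find_spec(name) is not None]
    return python_ok, frameworks

python_ok, frameworks = check_environment()
print("python>=3.7:", python_ok)
print("deep learning frameworks found:", frameworks or "none")
```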

Configure training

• In ModelScope, configuration files can be used to set up processes such as data preprocessing, model training, and model evaluation. The configuration file in a model repository is named configuration.json.

• In the configuration file, we can set the fields required for training, such as the running framework (pytorch/tensorflow), the task name (classification, similarity, etc.), the preprocessing step (preprocessor), and the training optimizer (optimizer). For detailed field descriptions, see Configuration Details.

• Below is an example configuration file in which we use StructBERT to train a sentence similarity model.
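For illustration, a minimal configuration.json sketch is shown below. The field names follow the pattern used in ModelScope tutorials, but the exact schema and values are assumptions; consult Configuration Details and the model card for the authoritative version.

```json
{
  "framework": "pytorch",
  "task": "sentence-similarity",
  "preprocessor": {
    "type": "sen-sim-tokenizer",
    "first_sequence": "sentence1",
    "second_sequence": "sentence2"
  },
  "train": {
    "max_epochs": 5,
    "dataloader": { "batch_size_per_gpu": 32, "workers_per_gpu": 1 },
    "optimizer": { "type": "SGD", "lr": 0.01 },
    "lr_scheduler": { "type": "StepLR", "step_size": 2 }
  },
  "evaluation": {
    "dataloader": { "batch_size_per_gpu": 32, "workers_per_gpu": 1 }
  }
}
```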

Ten lines of code to start training

The following uses the afqmc (Ant Financial Question Matching Corpus) dataset as an example to demonstrate model training based on StructBERT.
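A sketch of the "ten lines" fine-tuning flow, modeled on ModelScope's trainer API, is shown below. The dataset name and model id are assumptions for illustration; take the exact identifiers from the dataset and model pages.

```python
WORK_DIR = "/tmp/structbert_afqmc"  # where checkpoints and logs are written

def build_finetune_kwargs(model_id, train_dataset, eval_dataset,
                          work_dir=WORK_DIR):
    """Collect the arguments a ModelScope trainer expects."""
    return dict(model=model_id,
                train_dataset=train_dataset,
                eval_dataset=eval_dataset,
                work_dir=work_dir)

def finetune(model_id="damo/nlp_structbert_sentence-similarity_chinese-base"):
    # Lazy imports so the sketch is readable without ModelScope installed.
    from modelscope.msdatasets import MsDataset
    from modelscope.trainers import build_trainer
    # Assumed dataset id; check the dataset card for the real one.
    train = MsDataset.load("afqmc_small", split="train")
    val = MsDataset.load("afqmc_small", split="validation")
    trainer = build_trainer(
        default_args=build_finetune_kwargs(model_id, train, val))
    trainer.train()
```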

After running the code, we can easily observe the training progress (loss changes, memory usage, training duration, etc.) in the log.

For more ModelScope finetune settings, refer to the document: Best Practices for Text Classification Finetune.

ModelScope experience feedback

• Overall, ModelScope has open-sourced high-quality models for many important scenarios, allowing developers to start from a higher point (for example, pre-trained models such as StructBERT) and build on them for their own scenarios.

• On the other hand, ModelScope is still in its infancy, and some areas have yet to be built out (such as more, and more meaningful, real datasets). We expect ModelScope to keep getting better.
