
EasyTransfer - A Simple and Scalable Deep Transfer Learning Platform for NLP Applications

This paper introduces EasyTransfer, a toolkit designed by Alibaba Cloud and Zhejiang University to make it easier to develop deep transfer learning for NLP applications.

By Minghui Qiu, Peng Li, Hanjie Pan, Chengyu Wang, Ang Wang, Cen Chen, Yaliang Li, Dehong Gao, Jun Huang, Yong Li, Jun Yang, Deng Cai, and Wei Lin, from Alibaba Group and Zhejiang University, China

To read the full paper on EasyTransfer, please visit https://arxiv.org/abs/2011.09463

Transfer Learning (TL) is a rapidly growing field of machine learning that aims to improve the learning of a data-deficient task by transferring knowledge from related, data-sufficient tasks. Given the strong representation learning abilities of deep neural networks, TL methods built on neural architectures, i.e., deep transfer learning, have gained increasing popularity and have been shown to be effective for a wide variety of applications.

Several TL toolkits have also been developed to make it easy to apply TL algorithms. Notable projects include:

  • The NVIDIA Transfer Learning Toolkit (TLT) is a Python-based AI toolkit for training models and customizing them with users' own datasets. However, it mainly focuses on computer vision.
  • Amazon Xfer is an MXNet library that largely automates deep TL. It provides a "ModelHandler" component for extracting features from pre-trained models and a "Repurposer" component for re-purposing models for target tasks.
  • The Tsinghua Transfer Learning Toolkit is an integrated interface to 17 TL models written in Python. It includes five types of models, namely "feature-based", "concept-based", "parameter-based", "instance-based" and "deep-learning-based".
  • The Hugging Face Transformers toolkit specifically addresses model fine-tuning, especially for BERT-like models. It is backed by PyTorch and TensorFlow and integrates 30+ pre-trained language models; a minimal fine-tuning sketch in this style is shown after this list.
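
For reference, fine-tuning a pre-trained encoder with the Hugging Face Transformers API can be as short as the sketch below. The checkpoint name and the toy input are illustrative choices, not taken from the paper.

```python
# Minimal fine-tuning sketch with Hugging Face Transformers (PyTorch backend).
# The checkpoint name and toy inputs are illustrative, not taken from the paper.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

# Encode one toy example and take a single training step;
# a real fine-tuning loop would iterate over a DataLoader with an optimizer.
inputs = tokenizer("transfer learning improves low-resource tasks", return_tensors="pt")
labels = torch.tensor([1])

outputs = model(**inputs, labels=labels)
outputs.loss.backward()
```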

Challenges of Existing Solutions

However, when it comes to industrial-scale, real-world applications, the above-mentioned toolkits may be less than ideal. The reasons are threefold.

  1. Deep learning models are getting larger and larger, which makes it difficult to deploy them in real-time applications. For example, pre-trained contextual representation encoders, such as BERT, RoBERTa and GPT, have been widely adopted in a variety of Natural Language Processing (NLP) tasks. Despite their effectiveness, these models are trained on large-scale datasets and often have parameters at the billion scale (a rough memory estimate follows this list).
  2. There are a variety of TL algorithms proposed in the literature, yet no comprehensive TL toolkit is available for users to examine different types of state-of-the-art TL algorithms.
  3. A huge gap still exists between developing a novel algorithm for a specific task and deploying that algorithm in online production. For many online applications, it remains non-trivial to provide a reliable service that meets high QPS (queries per second) requirements. In a nutshell, a comprehensive, industry-scale deep TL toolkit is highly necessary.
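
To make the deployment concern in point 1 concrete, a back-of-the-envelope estimate (with illustrative parameter counts, not figures from the paper) shows how quickly the memory footprint of the weights alone grows:

```python
# Rough memory needed just to store model weights (illustrative numbers).
def weight_memory_gb(num_params, bytes_per_param=4):  # 4 bytes per parameter = FP32
    return num_params * bytes_per_param / 1024 ** 3

print(f"BERT-base (~110M params): {weight_memory_gb(110e6):.2f} GB")   # ~0.41 GB
print(f"1.5B-param encoder:       {weight_memory_gb(1.5e9):.2f} GB")   # ~5.59 GB
```

Serving models of this size under tight latency budgets is therefore non-trivial, which motivates the industry-scale tooling described next.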

The EasyTransfer Toolkit

To bridge this gap, we developed the EasyTransfer toolkit and released it to the open-source community. EasyTransfer is built with highly scalable distributed training strategies that facilitate large-scale model training. It supports a comprehensive suite of TL algorithms that can be used in various NLP tasks, providing a unified pipeline for model training, inference and deployment in real-world applications.

The toolkit is open-sourced on GitHub. Currently, we have integrated EasyTransfer into a number of deep learning products at Alibaba and observed notable performance gains.

Major Contributions of EasyTransfer

  • We are the first to propose a simple and scalable deep TL platform to make it easy to develop deep TL algorithms for NLP applications.
  • EasyTransfer supports mainstream TL algorithms, divided into five categories, namely model fine-tuning, feature-based TL, instance-based TL, model-based TL and meta learning; a feature-based example is sketched after this list.
  • EasyTransfer is equipped with ModelZoo, containing more than 20 mainstream pre-trained language models and a multimodality model.
  • EasyTransfer further integrates AppZoo, a collection of state-of-the-art models for mainstream NLP applications, together with simple user interfaces to support these applications.
  • EasyTransfer is seamlessly integrated with Alibaba PAI products, making it easy for users to conduct model training, evaluation and online deployment on the cloud.
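
As an illustration of the feature-based category listed above, a common recipe is to freeze a pre-trained encoder and train only a lightweight task head on its features. The sketch below is a generic PyTorch/Transformers example, not the EasyTransfer API; the checkpoint name and toy input are illustrative.

```python
# Generic feature-based transfer learning sketch (illustrative, not the EasyTransfer API):
# freeze a pre-trained encoder and train only a small task-specific head on its features.
import torch
import torch.nn as nn
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

encoder = AutoModel.from_pretrained("bert-base-uncased")
for p in encoder.parameters():
    p.requires_grad = False                         # keep source-task knowledge fixed

head = nn.Linear(encoder.config.hidden_size, 2)     # target-task classifier
optimizer = torch.optim.Adam(head.parameters(), lr=1e-3)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
batch = tokenizer(["a data-deficient target-task example"], return_tensors="pt")
labels = torch.tensor([0])

with torch.no_grad():                                # extract features, no encoder gradients
    features = encoder(**batch).last_hidden_state[:, 0]   # [CLS] representation

loss = F.cross_entropy(head(features), labels)       # train only the head
loss.backward()
optimizer.step()
```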

The EasyTransfer toolkit has also been deployed at Alibaba to support a variety of business scenarios, including item recommendation, personalized search and conversational question answering. EasyTransfer is released under the Apache 2.0 License and has been open-sourced on GitHub. Detailed documentation and tutorials are available at https://www.yuque.com/easytransfer/cn

To read the full paper on EasyTransfer, please visit https://arxiv.org/abs/2011.09463
