T-LBERT with Domain Adaptation for Cross-Domain Sentiment Classification



Hongye Cao

School of Software, Northwestern Polytechnical University, China


Qianru Wei

School of Software, Northwestern Polytechnical University, China


Jiangbin Zheng

School of Software, Northwestern Polytechnical University, China


Abstract: Cross-domain sentiment classification transfers knowledge from a source domain to a target domain that lacks supervised information for sentiment classification. Existing cross-domain sentiment classification methods establish connections by extracting domain-invariant features manually. However, these methods adapt poorly when bridging different domains and ignore important sentiment information. Hence, we propose a Topic Lite Bidirectional Encoder Representations from Transformers (T-LBERT) model with domain adaptation to improve the adaptability of cross-domain sentiment classification. It combines the learning content of the source domain with the topic information of the target domain to improve the domain adaptability of the model. Because information is unevenly distributed in the combined data, we apply a two-layer attention-based adaptive mechanism for classification. A shallow attention layer weighs the important features of the combined data. Inspired by active learning, we propose a deep domain adaptation layer, which actively adjusts model parameters to balance the difference and representativeness between domains. Experimental results on Amazon review datasets demonstrate that the T-LBERT model considerably outperforms other state-of-the-art methods, and T-LBERT shows stable classification performance on multiple metrics.
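The shallow attention layer described above can be illustrated with a minimal sketch: combined source-domain features and target-domain topic vectors are scored, softmax-normalized, and pooled into a single representation for a sentiment head. This is a hedged illustration of the general mechanism only, not the authors' implementation; all array names, dimensions, and the additive scoring function are hypothetical assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    # numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Hypothetical combined input: encoder features for a source-domain review
# stacked with target-domain topic vectors (rows x feature dim).
source_feats = rng.normal(size=(10, 16))  # review token features (assumed)
topic_feats = rng.normal(size=(4, 16))    # target-domain topic vectors (assumed)
combined = np.vstack([source_feats, topic_feats])

# Shallow attention: score each row, normalize the scores, and pool so
# that informative rows dominate the combined representation.
w_att = rng.normal(size=(16,))            # attention parameters (untrained)
scores = combined @ w_att                 # one scalar score per row
alpha = softmax(scores)                   # attention weights, sum to 1
pooled = alpha @ combined                 # weighted sum -> vector of dim 16

# Toy two-class sentiment head on the pooled vector (untrained weights,
# shown only to complete the data flow through classification).
w_cls = rng.normal(size=(16, 2))
probs = softmax(pooled @ w_cls)
```

The attention weights `alpha` form a distribution over the combined rows, so rows carrying little information receive weights near zero and contribute little to the pooled representation passed to the classifier.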

Keywords: Cross-domain, sentiment classification, topic model, attention, domain adaptation.

Received November 1, 2021; accepted April 4, 2022

https://doi.org/10.34028/iajit/20/1/15
