Improving the performance of aspect-based sentiment analysis using a fine-tuned BERT Base Uncased model


Bibliographic Details
Published in: International Journal of Intelligent Networks, Vol. 2, pp. 64-69
Main Authors: Geetha, M.P.; Karthika Renuka, D.
Format: Journal Article
Language: English
Published: Elsevier B.V. / KeAi Communications Co., Ltd, 2021

Summary: Nowadays, digital reviews and ratings on e-commerce platforms give consumers a better way to choose products. E-commerce giants such as Amazon and Flipkart provide customers with a forum to share their experiences, giving potential consumers genuine evidence of a product's quality. To obtain useful insights from a broad collection of reviews, it is important to separate them into positive and negative sentiment. In the proposed work, sentiment analysis is performed on consumer review data, categorizing reviews as positive or negative. From among the various classification models, Naïve Bayes, LSTM, and Support Vector Machine (SVM) classifiers were employed for review classification. Many current sentiment analysis techniques for online product review text suffer from low accuracy and long training times. In this research work, the BERT Base Uncased model, a powerful deep learning model, is presented to address the sentiment analysis task. In the experimental evaluation, the BERT model outperformed the other machine learning methods, giving better predictions and higher accuracy.
Keywords: Sentiment analysis; BERT; Natural language processing; Product reviews; Machine learning.
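To illustrate the kind of baseline classification the abstract compares BERT against, here is a minimal multinomial Naïve Bayes sentiment sketch in pure Python. The toy reviews, labels, and function names are invented for illustration and are not the paper's actual dataset or pipeline:

```python
import math
from collections import Counter

def train_nb(docs):
    """Train a multinomial Naive Bayes model on (text, label) pairs."""
    word_counts = {"pos": Counter(), "neg": Counter()}
    class_counts = Counter()
    for text, label in docs:
        class_counts[label] += 1
        word_counts[label].update(text.lower().split())
    vocab = {w for counts in word_counts.values() for w in counts}
    return word_counts, class_counts, vocab

def classify(text, word_counts, class_counts, vocab):
    """Return the most probable label using Laplace-smoothed log-probabilities."""
    total_docs = sum(class_counts.values())
    best_label, best_score = None, float("-inf")
    for label in class_counts:
        # Log prior for the class.
        score = math.log(class_counts[label] / total_docs)
        denom = sum(word_counts[label].values()) + len(vocab)
        # Add-one smoothed log likelihood of each token.
        for w in text.lower().split():
            score += math.log((word_counts[label][w] + 1) / denom)
        if score > best_score:
            best_label, best_score = label, score
    return best_label

# Toy review data (invented for illustration).
train = [
    ("great product works well", "pos"),
    ("excellent quality very happy", "pos"),
    ("terrible broke after a day", "neg"),
    ("poor quality waste of money", "neg"),
]
model = train_nb(train)
print(classify("great quality very happy", *model))   # pos
print(classify("waste of money terrible", *model))    # neg
```

The paper's reported improvement comes from replacing such bag-of-words baselines with a fine-tuned BERT Base Uncased encoder, which models word order and context rather than treating tokens independently.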
ISSN: 2666-6030
DOI: 10.1016/j.ijin.2021.06.005