
Sentiment Analysis for E-Commerce



Team



Scenario

PyCharm with Python 3.7 was used so that the spaCy library works properly.

We wrote two methods to convert the sentences: one that writes the dataset with the children/head information (about 1 hour) and one without it (about 20 minutes).
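As a rough illustration of the faster variant that does not use the children/head information, a common heuristic is to prefix every token between a negation cue and the next punctuation mark with a marker such as NOT_. The cue list and the NOT_ marker below are assumptions for the sketch, not the exact rules used in the project:

```python
# Minimal sketch of a negation conversion that does NOT use the dependency tree.
# The cue list and the "NOT_" marker are illustrative assumptions.
NEGATION_CUES = {"not", "no", "never", "n't", "cannot"}
PUNCTUATION = {".", ",", ";", ":", "!", "?"}

def convert_without_tree(tokens):
    """Prefix tokens that follow a negation cue (up to the next punctuation) with NOT_."""
    converted, in_scope = [], False
    for tok in tokens:
        if tok.lower() in NEGATION_CUES:
            in_scope = True
            converted.append(tok)
        elif tok in PUNCTUATION:
            in_scope = False
            converted.append(tok)
        else:
            converted.append("NOT_" + tok if in_scope else tok)
    return converted

print(convert_without_tree("this product is not good at all .".split()))
# ['this', 'product', 'is', 'not', 'NOT_good', 'NOT_at', 'NOT_all', '.']
```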

In agreement with Professor Luc Lamontagne, and to make task 3 run faster, the sentences converted by the dependency-tree method of the negation_conversion.py file were written to text files.

These files were produced with the write_negated method found in the sentiment_analysis.py file and are stored in the data folder.
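The exact signature of write_negated is not shown here; a minimal sketch of the idea, assuming one converted sentence per line and an illustrative file name under the data folder, could look like this:

```python
import os

def write_negated(sentences, output_path="data/negated_sentences.txt"):
    """Write one converted sentence per line so task 3 can reuse them without re-parsing.
    The path and the one-sentence-per-line format are illustrative assumptions."""
    os.makedirs(os.path.dirname(output_path), exist_ok=True)
    with open(output_path, "w", encoding="utf-8") as f:
        for sentence in sentences:
            f.write(sentence.strip() + "\n")
```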

Also, the NLTK SentiWordNet corpus must be downloaded so that the sentiment_analyse.py file can work.
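A minimal way to fetch the corpus and sanity-check it with NLTK's downloader and SentiWordNet interface (the example word is arbitrary):

```python
import nltk

# Download the corpora required for the SentiWordNet lookups (WordNet is needed
# because SentiWordNet is built on top of its synsets).
nltk.download("sentiwordnet")
nltk.download("wordnet")

from nltk.corpus import sentiwordnet as swn

# Quick sanity check: positive/negative/objective scores for an adjective.
synset = next(iter(swn.senti_synsets("terrible", "a")))
print(synset, synset.pos_score(), synset.neg_score(), synset.obj_score())
```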



Steps

(See the negation_conversion.py file.)

  • Step 1: Find the scope of the sentence
  • Step 2: Capture the negation scope
  • Step 3: Convert and reconstruct the sentence
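A minimal sketch of these three steps using spaCy's dependency parse; the en_core_web_sm model, the NOT_ marker, and the scope heuristic (tokens after the cue inside the negated head's subtree) are assumptions, while the real rules live in negation_conversion.py:

```python
import spacy

nlp = spacy.load("en_core_web_sm")

def convert_with_tree(text):
    """Step 1: parse the sentence; Step 2: capture the negation scope from the
    dependency tree; Step 3: convert the tokens in scope and rebuild the sentence."""
    doc = nlp(text)
    converted = [tok.text for tok in doc]
    for tok in doc:
        if tok.dep_ == "neg":  # negation cue such as "not" or "n't"
            # Assumed scope: tokens after the cue inside the negated head's subtree.
            scope = {t.i for t in tok.head.subtree if t.i > tok.i and not t.is_punct}
            for i in scope:
                converted[i] = "NOT_" + doc[i].text
    return " ".join(converted)

print(convert_with_tree("This camera is not good."))
# e.g. "This camera is not NOT_good ."
```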


Results





Discussion


