6 skills found
gmftbyGMFTBY / MultiTurnDialogZoo
Multi-turn dialogue baselines written in PyTorch
Sanghoon94 / DailyDialogue Parser
Parser for the DailyDialogue dataset
ntunlplab / LifeEventDialog
Life Event Dialog contains fine-grained personal life event annotations on DailyDialog.
ntunlplab / Lifelog DiaLog
Conversation, a common way for people to share their experiences and feelings with others, carries important information about individuals' personal life events, yet it is rarely explored. This dataset initiates the task of detecting personal life events from daily conversation. We extend a multi-turn dialogue dataset, DailyDialog, with life event annotations, collecting 600 conversations of 4-6 utterances from 4 DailyDialog topics. Our goal is to detect each speaker's life events in real time.
LaDerawk / LSTM For Next Word Prediction
LSTM + Attention model for next-word prediction using DailyDialog.
nikh03 / Emotion Recognition Using Natural Language Processing And BERT
Emotion is an important aspect of our lives that influences day-to-day activities, including social behavior, friendship, family, and work. Being able to recognize emotions can help in domains such as psychology, human-computer interaction, e-learning, and marketing. We conducted a comparative study of three approaches, traditional machine learning, neural networks, and Bidirectional Encoder Representations from Transformers (BERT), to determine which yields the highest accuracy. The approaches were evaluated on a balanced dataset combined from the DailyDialog, ISEAR, and Emotion-Stimulus datasets, with five labels: joy, sadness, anger, fear, and neutral. We targeted implicit emotion recognition, a challenging problem because the emotion is hidden within the text and must be inferred from context. The output is the emotion identified in the text, classified into one of the five labels. The BERT-based model achieved 91.72% accuracy, significantly higher than the traditional machine learning and neural network approaches.
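The repository's own code is not shown here, but the five-label setup it describes can be sketched as a small classification head of the kind typically placed on top of a BERT-style pooled representation. This is a minimal, hypothetical sketch: the hidden size 768 is assumed from bert-base, and only the label set comes from the project description.

```python
import torch
import torch.nn as nn

# Label set from the project description; order is an assumption.
LABELS = ["joy", "sadness", "anger", "fear", "neutral"]

class EmotionHead(nn.Module):
    """Hypothetical 5-way classification head over a pooled encoder output."""

    def __init__(self, hidden_size: int = 768, num_labels: int = len(LABELS)):
        super().__init__()
        self.dropout = nn.Dropout(0.1)
        self.classifier = nn.Linear(hidden_size, num_labels)

    def forward(self, pooled: torch.Tensor) -> torch.Tensor:
        # pooled: (batch, hidden_size) -> logits: (batch, num_labels)
        return self.classifier(self.dropout(pooled))

head = EmotionHead()
# Stand-in for BERT's pooled [CLS] vectors for a batch of 2 texts.
logits = head(torch.randn(2, 768))
print(logits.shape)  # torch.Size([2, 5])
```

In a real fine-tuning run the pooled input would come from a pretrained encoder and the head would be trained with cross-entropy loss over the five labels.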