This study proposes an AI model that automates the identification of depressive patients by leveraging Natural Language Processing (NLP) and pre-trained language models.
Comparison of KoBERT and BERT for Emotion Classification of Healthcare Text Data. Mose Gu, Department of Computer Science and Engineering, Sungkyunkwan University. Published Oct 11, 2023, by Mose Gu and others.
To process the Korean emotional conversation corpus, we employ KoBERT, a BERT model pre-trained specifically on Korean data.
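A hedged sketch of the step the snippet above describes: once a KoBERT-style encoder has produced a pooled [CLS] embedding for a Korean sentence, emotion classification reduces to a linear head plus softmax over the emotion labels. The label set, embedding, and weights below are illustrative stand-ins, not the paper's actual parameters or a real model's outputs.

```python
import math

EMOTIONS = ["joy", "sadness", "anger", "fear"]  # hypothetical label set

def softmax(logits):
    # Numerically stable softmax over a list of logits.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def classify(pooled, weights, bias):
    # One logit per emotion: dot(pooled, w_k) + b_k, then softmax.
    logits = [sum(p * w for p, w in zip(pooled, wk)) + bk
              for wk, bk in zip(weights, bias)]
    probs = softmax(logits)
    best = max(range(len(probs)), key=probs.__getitem__)
    return EMOTIONS[best], probs

# Toy 4-dim "embedding" and 4x4 classification weights, for demonstration only.
pooled = [0.9, -0.2, 0.1, 0.4]
weights = [[1.0, 0.0, 0.0, 0.0],
           [-1.0, 0.5, 0.0, 0.0],
           [0.0, 0.0, 1.0, -0.5],
           [0.0, -0.5, 0.0, 1.0]]
bias = [0.0, 0.0, 0.0, 0.0]

label, probs = classify(pooled, weights, bias)
```

In a real fine-tuning setup the pooled vector would come from the pre-trained Korean encoder and the head's weights would be learned from the labeled emotion corpus; only the head-plus-softmax mechanics are shown here.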
The simulation results show that the KoBERT model achieved higher performance, exceeding the other models by more than 5% and the lowest-performing model by close to 18%.
Sep 14, 2022 · We also showed that pre-trained language models such as KoBERT substantially improved performance compared with conventional RNN and LSTM models.
For this purpose, health counseling data from the Naver Q&A service were crawled as a dataset. KoBERT was used to classify medical subjects according to symptoms.
May 8, 2024 · Our objective was to compare the performance of unsupervised representations of sequences of disease codes generated by bag-of-words versus sequence-based NLP methods.
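An illustrative contrast for the comparison mentioned above (not the study's actual pipeline): a bag-of-words representation discards token order, so two clinically different code sequences can collapse to the same vector, while even a minimal sequence-based representation (here, ordered bigrams) keeps them apart. The diagnosis codes and visit histories are hypothetical.

```python
from collections import Counter

def bag_of_words(codes):
    # Order-free multiset of codes.
    return Counter(codes)

def bigrams(codes):
    # Order-preserving adjacent pairs.
    return list(zip(codes, codes[1:]))

# Two hypothetical visit histories with the same codes in different order.
patient_a = ["E11", "I10", "N18"]   # e.g. diabetes -> hypertension -> CKD
patient_b = ["N18", "I10", "E11"]   # same codes, reversed order

same_bow = bag_of_words(patient_a) == bag_of_words(patient_b)
same_seq = bigrams(patient_a) == bigrams(patient_b)
```

Here `same_bow` is true while `same_seq` is false, which is exactly the distinction a bag-of-words versus sequence-model comparison probes.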
Sep 28, 2022 · The proposed CMD maps a feature space between teacher and student models based on contrastive learning.
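A hedged sketch of the contrastive idea behind teacher-student feature mapping: the student's feature for a sample should score high against the teacher's feature for the same sample (positive pair) and low against the teacher features of other samples (negatives), as in an InfoNCE-style loss. The feature vectors and temperature below are toy values, not outputs of real models or the CMD method's exact formulation.

```python
import math

def cosine(u, v):
    # Cosine similarity between two vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def info_nce(student, teachers, pos_index, temperature=0.1):
    # Cross-entropy of a softmax over similarities; the target is the
    # teacher feature of the same sample (the positive pair).
    sims = [cosine(student, t) / temperature for t in teachers]
    m = max(sims)
    exps = [math.exp(s - m) for s in sims]
    return -math.log(exps[pos_index] / sum(exps))

# Toy teacher features for a batch of three samples.
teachers = [[1.0, 0.0], [0.0, 1.0], [0.7, 0.7]]

# Student feature aligned with teacher 0 vs. pointing elsewhere.
aligned = info_nce([0.9, 0.1], teachers, pos_index=0)
misaligned = info_nce([0.1, 0.9], teachers, pos_index=0)
```

Training the student to minimize this loss pulls its feature space toward the teacher's, sample by sample, which is the general mechanism a contrastive distillation method exploits.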
Gu, Mose; Jeong, Jaehoon Paul. Comparison of KoBERT and BERT for Emotion Classification of Healthcare Text Data (conference paper). 2023 14th International ...