Workshop Proceedings of the 19th International AAAI Conference on Web and Social Media

Workshop: CySoc 2025: 6th International Workshop on Cyber Social Threats

DOI: 10.36190/2025.17

Published: 2025-06-05
StressSGCL: A Stress-Specific Embeddings Learning Approach using Contrastive Learning over Skip-Gram Model
Muhammad Abulaish, Rumjot Kaur, Amit Kumar Sah

In this paper, we present StressSGCL, a novel approach that produces stress-specific word embeddings to enhance stress detection in social media posts. The proposed approach improves contextual representation by integrating the skip-gram model with transformer-based Pre-trained Language Models (PLMs). It then refines these embeddings through supervised contrastive learning, which sharpens the differentiation of stress-related emotional content. StressSGCL is evaluated using two PLMs (BERT and MentalBERT) on three benchmark datasets from Twitter and Reddit. Empirical evaluation employing 10-fold cross-validation demonstrates that the stress-specific embeddings produced by StressSGCL consistently outperform various ablation baselines and existing stress detection methods. StressSGCL advances mental health text analysis by demonstrating how static embedding techniques can effectively supplement dynamic PLMs to improve performance on domain-specific problems such as stress detection.
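The supervised contrastive refinement step described in the abstract can be illustrated with a minimal NumPy sketch. This follows the standard supervised contrastive loss formulation (Khosla et al., 2020), not the authors' actual implementation; the function name, toy embeddings, and temperature value are illustrative assumptions. The loss pulls embeddings of posts with the same label (e.g., stressed) together and pushes differently labeled posts apart.

```python
import numpy as np

def sup_con_loss(embeddings, labels, temperature=0.1):
    """Supervised contrastive loss over a batch of embeddings.

    A sketch of the refinement objective, assuming the standard
    formulation: for each anchor, same-label samples are positives
    and all other samples form the denominator.
    """
    # L2-normalize so similarity is cosine similarity
    z = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = z @ z.T / temperature  # pairwise scaled similarities
    n = len(labels)
    total, anchors = 0.0, 0
    for i in range(n):
        mask = np.ones(n, dtype=bool)
        mask[i] = False  # exclude self-similarity
        positives = [j for j in range(n) if j != i and labels[j] == labels[i]]
        if not positives:
            continue  # anchor has no same-label partner in the batch
        log_denom = np.log(np.exp(sim[i][mask]).sum())
        # average -log p(positive | anchor) over the anchor's positives
        total += np.mean([log_denom - sim[i][j] for j in positives])
        anchors += 1
    return total / anchors

# Toy batch: two "stressed" posts and two "non-stressed" posts.
labels = [1, 1, 0, 0]
clustered = np.array([[1.0, 0.0], [0.9, 0.1], [0.0, 1.0], [0.1, 0.9]])
mixed = np.array([[1.0, 0.0], [0.0, 1.0], [0.9, 0.1], [0.1, 0.9]])
# Well-separated classes yield a lower loss than mixed-up classes.
print(sup_con_loss(clustered, labels) < sup_con_loss(mixed, labels))
```

In practice this objective would be applied to the fused skip-gram/PLM representations during fine-tuning, with gradients computed by an autodiff framework rather than evaluated directly as here.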