Supriyono, Supriyono (ORCID: https://orcid.org/0000-0002-4733-9189), Wibawa, Aji Prasetya, Suyono, Suyono and Kurniawan, Fachrul (ORCID: https://orcid.org/0000-0002-3709-8764) (2025) Enhancing text summarization of humorous texts with attention-augmented LSTM and discourse-aware decoding. International Journal of Engineering, Science and Information Technology, 5 (3). pp. 156-168. ISSN 2775-2674
Text: 25662.pdf - Published Version. Available under License Creative Commons Attribution (682kB).
Abstract
Abstractive summarization of humorous narratives presents unique computational challenges due to humor's multimodal, context-dependent nature. Conventional models often fail to preserve the rhetorical structure essential to comedic discourse, particularly the relationship between setup and punchline. This study proposes a novel Attention-Augmented Long Short-Term Memory (LSTM) model with discourse-aware decoding to enhance the summarization of stand-up comedy performances. The model is trained to capture temporal alignment between narrative elements and audience reactions by leveraging a richly annotated dataset of over 10,000 timestamped transcripts, each marked with audience laughter cues. The architecture integrates bidirectional encoding, attention mechanisms, and a cohesion-first decoding strategy to retain humor's structural and affective dynamics. Experimental evaluations demonstrate the proposed model outperforms baseline LSTM and transformer configurations in ROUGE scores and qualitative punchline preservation. Attention heatmaps and confusion matrices reveal the model's capability to prioritize humor-relevant content and align it with audience responses. Furthermore, analyses of laughter distribution, narrative length, and humor density indicate that performance improves when the model adapts to individual performers' pacing and delivery styles. The study also introduces punchline-aware evaluation as a critical metric for assessing summarization quality in humor-centric domains. The findings contribute to advancing discourse-sensitive summarization methods and offer practical implications for designing humor-aware AI systems. This research underscores the importance of combining structural linguistics, behavioral annotation, and deep learning to capture the complexity of comedic communication in narrative texts.
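The abstract describes a bidirectional LSTM encoder combined with an attention mechanism and a cohesion-first decoder. As an illustration only, the sketch below shows one conventional way such an attention-augmented encoder-decoder can be wired up in PyTorch; the class name, dimensions, and the plain teacher-forced decoding loop are assumptions made for this sketch, not the authors' published implementation, and the discourse-aware, punchline-preserving decoding strategy is not reproduced here.

```python
# Minimal sketch (assumed, not the authors' code): a bidirectional LSTM encoder
# with additive attention over encoder states feeding an LSTM decoder.
import torch
import torch.nn as nn

class AttentionAugmentedLSTM(nn.Module):
    def __init__(self, vocab_size=10000, emb_dim=128, hidden_dim=256):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        # Bidirectional encoder over the transcript tokens
        self.encoder = nn.LSTM(emb_dim, hidden_dim, batch_first=True,
                               bidirectional=True)
        # Decoder consumes [token embedding; attention context]
        self.decoder = nn.LSTMCell(emb_dim + 2 * hidden_dim, 2 * hidden_dim)
        # Additive (Bahdanau-style) attention scoring
        self.attn_w = nn.Linear(4 * hidden_dim, hidden_dim)
        self.attn_v = nn.Linear(hidden_dim, 1, bias=False)
        self.out = nn.Linear(2 * hidden_dim, vocab_size)

    def attention(self, dec_h, enc_out):
        # dec_h: (B, 2H); enc_out: (B, T, 2H) -> context (B, 2H), weights (B, T)
        scores = self.attn_v(torch.tanh(self.attn_w(torch.cat(
            [enc_out, dec_h.unsqueeze(1).expand_as(enc_out)], dim=-1))))
        weights = torch.softmax(scores.squeeze(-1), dim=-1)
        context = torch.bmm(weights.unsqueeze(1), enc_out).squeeze(1)
        return context, weights

    def forward(self, src, tgt):
        enc_out, _ = self.encoder(self.embedding(src))   # (B, T, 2H)
        B = src.size(0)
        h = enc_out.new_zeros(B, enc_out.size(-1))
        c = enc_out.new_zeros(B, enc_out.size(-1))
        logits = []
        for t in range(tgt.size(1)):                     # teacher forcing
            context, _ = self.attention(h, enc_out)
            step_in = torch.cat([self.embedding(tgt[:, t]), context], dim=-1)
            h, c = self.decoder(step_in, (h, c))
            logits.append(self.out(h))
        return torch.stack(logits, dim=1)                # (B, L, vocab)

# Usage with random token ids, just to show shapes
model = AttentionAugmentedLSTM()
src = torch.randint(0, 10000, (2, 40))   # transcript tokens
tgt = torch.randint(0, 10000, (2, 12))   # summary tokens (teacher forcing)
print(model(src, tgt).shape)             # torch.Size([2, 12, 10000])
```

The per-step attention weights returned by `attention` are the quantities that attention heatmaps of the kind mentioned in the abstract would visualize; how the paper's cohesion-first decoding re-weights or constrains them is not shown in this sketch.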
| Item Type: | Journal Article |
|---|---|
| Keywords: | Abstractive Summarization, Attention Mechanism, Cohesion-Aware Decoding, Humor Detection, LSTM. |
| Subjects: | 08 INFORMATION AND COMPUTING SCIENCES > 0801 Artificial Intelligence and Image Processing > 080107 Natural Language Processing; 10 TECHNOLOGY > 1099 Other Technology |
| Divisions: | Faculty of Technology > Department of Informatics Engineering |
| Depositing User: | Supriyono Supriyono |
| Date Deposited: | 08 Dec 2025 14:22 |