Profile

MURUGARAJ Keerthana

University of Luxembourg, Faculty of Science, Technology and Medicine (FSTM), Department of Computer Science (DCS)

ORCID
0009-0008-5100-055X
Main Referenced Co-authors
LAMSIYAH, Salima  (3)
SCHOMMER, Christoph  (2)
DURING, Marten  (1)
THEOBALD, Martin  (1)
Main Referenced Keywords
Comparative Study (1); Extractive Text Summarization (1); HistBERT (1); Historical Domain (1); Historical Text Analysis (1);
Main Referenced Disciplines
Computer science (3)

Publications (total 3)

The most downloaded
59 downloads
K. MURUGARAJ, S. LAMSIYAH, M. DURING, and M. THEOBALD. "Mining the Past: A Comparative Study of Classical and Neural Topic Models on Historical Newspaper Archives." Albuquerque, United States: Association for Computational Linguistics, 2025. https://hdl.handle.net/10993/64861

The most cited

2 citations (Scopus®)

K. MURUGARAJ, S. LAMSIYAH, and C. SCHOMMER. "Abstractive Summarization of Historical Documents: A New Dataset and Novel Method Using a Domain-Specific Pretrained Model." IEEE Access, 13 (2025): 10918-10932. doi:10.1109/access.2025.3528733 https://hdl.handle.net/10993/63750

Scientific outputs

Articles in scientific journals with peer review verified by ORBi or included in the HEC journal guide

K. MURUGARAJ, S. LAMSIYAH, and C. SCHOMMER. "Abstractive Summarization of Historical Documents: A New Dataset and Novel Method Using a Domain-Specific Pretrained Model." IEEE Access, 13 (2025): 10918-10932. doi:10.1109/access.2025.3528733
Peer Reviewed verified by ORBi

Proceedings published in a book or a journal

K. MURUGARAJ, S. LAMSIYAH, M. DURING, and M. THEOBALD. "Mining the Past: A Comparative Study of Classical and Neural Topic Models on Historical Newspaper Archives." Albuquerque, United States: Association for Computational Linguistics, 2025.
Peer reviewed

S. LAMSIYAH, K. MURUGARAJ, and C. SCHOMMER. "Historical-Domain Pre-trained Language Model for Historical Extractive Text Summarization." London, United Kingdom: Avestia Publishing (https://avestia.com/), 2023. doi:10.11159/cist23.152
Peer reviewed
