Song, Y., Ezzini, S., Tang, X., Lothritz, C., Klein, J., Bissyande, T. F. D. A., Boytsov, A., Ble, U., & Goujon, A. (2024). Enhancing Text-to-SQL Translation for Financial System Design. In ICSE-SEIP '24: Proceedings of the 46th International Conference on Software Engineering: Software Engineering in Practice (pp. 11). New York, United States: Institute of Electrical and Electronics Engineers Inc. doi:10.1145/3639477.3639732. Peer reviewed.
Lothritz, C., Ezzini, S., Purschke, C., Bissyande, T. F. D. A., Klein, J., Olariu, I., Boytsov, A., Lefebvre, C., & Goujon, A. (2023). Comparing Pre-Training Schemes for Luxembourgish BERT Models. In Proceedings of the 19th Conference on Natural Language Processing (KONVENS 2023). Peer reviewed.
Lothritz, C., Lebichot, B., Allix, K., Ezzini, S., Bissyande, T. F. D. A., Klein, J., Boytsov, A., Lefebvre, C., & Goujon, A. (2023). Evaluating the Impact of Text De-Identification on Downstream NLP Tasks. In Proceedings of the 24th Nordic Conference on Computational Linguistics (NoDaLiDa). Tartu, Estonia: University of Tartu Library. Peer reviewed.
Lothritz, C. (2023). NLP De Luxe - Challenges for Natural Language Processing in Luxembourg [Doctoral thesis, University of Luxembourg]. ORBilu - University of Luxembourg. https://orbilu.uni.lu/handle/10993/54910
Olariu, I., Lothritz, C., Klein, J., Bissyande, T. F. D. A., Guo, S., & Haddadan, S. (2023). Evaluating Parameter-Efficient Finetuning Approaches for Pre-trained Models on the Financial Domain. In Empirical Methods in Natural Language Processing. Association for Computational Linguistics. Peer reviewed.
Olariu, I., Lothritz, C., Bissyande, T. F. D. A., & Klein, J. (2023). Evaluating Data Augmentation Techniques for the Training of Luxembourgish Language Models. In Conference on Natural Language Processing (KONVENS). Peer reviewed.
Lothritz, C., Lebichot, B., Allix, K., Veiber, L., Bissyande, T. F. D. A., Klein, J., Boytsov, A., Goujon, A., & Lefebvre, C. (2022). LuxemBERT: Simple and Practical Data Augmentation in Language Model Pre-Training for Luxembourgish. In Proceedings of the Language Resources and Evaluation Conference, 2022 (pp. 5080–5089). Peer reviewed.
Lothritz, C., Allix, K., Lebichot, B., Veiber, L., Bissyande, T. F. D. A., & Klein, J. (2021). Comparing MultiLingual and Multiple MonoLingual Models for Intent Classification and Slot Filling. In 26th International Conference on Applications of Natural Language to Information Systems (pp. 367–375). Springer. doi:10.1007/978-3-030-80599-9_32. Peer reviewed.
Arslan, Y., Allix, K., Veiber, L., Lothritz, C., Bissyande, T. F. D. A., Klein, J., & Goujon, A. (2021). A Comparison of Pre-Trained Language Models for Multi-Class Text Classification in the Financial Domain. In Companion Proceedings of the Web Conference 2021 (WWW '21 Companion), April 19–23, 2021, Ljubljana, Slovenia (pp. 260–268). New York, United States: Association for Computing Machinery. doi:10.1145/3442442.3451375. Peer reviewed.
Lothritz, C., Allix, K., Veiber, L., Klein, J., & Bissyande, T. F. D. A. (2020). Evaluating Pretrained Transformer-based Models on the Task of Fine-Grained Named Entity Recognition. In Proceedings of the 28th International Conference on Computational Linguistics (pp. 3750–3760). Peer reviewed.