Eprint already available on another site (E-prints, Working papers and Research blog)
Cross-Model Semantics in Representation Learning
NIKOOROO, Mohammadsaleh; ENGEL, Thomas
2025
 

Files


Full Text
Cross_M_Semantics.pdf
Author postprint (997.66 kB), Creative Commons License: Attribution-NonCommercial-NoDerivatives (CC BY-NC-ND)

Details



Keywords :
Computer Science - Learning; Computer Science - Artificial Intelligence
Abstract :
[en] The internal representations learned by deep networks are often sensitive to architecture-specific choices, raising questions about the stability, alignment, and transferability of learned structure across models. In this paper, we investigate how structural constraints, such as linear shaping operators and corrective paths, affect the compatibility of internal representations across different architectures. Building on insights from prior studies on structured transformations and convergence, we develop a framework for measuring and analyzing representational alignment across networks with distinct but related architectural priors. Through a combination of theoretical insights, empirical probes, and controlled transfer experiments, we demonstrate that structural regularities induce a representational geometry that is more stable under architectural variation. This suggests that certain forms of inductive bias not only support generalization within a model, but also improve the interoperability of learned features across models. We conclude with a discussion of the implications of representational transferability for model distillation, modular learning, and the principled design of robust learning systems.
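The abstract describes measuring representational alignment across networks. This record does not specify the paper's metric, but linear centered kernel alignment (CKA) is a standard choice for comparing representations across models; the sketch below is purely illustrative (the function name and toy data are assumptions, not taken from the paper).

```python
import numpy as np

def linear_cka(X, Y):
    """Linear Centered Kernel Alignment between representation matrices
    X (n x d1) and Y (n x d2) whose rows correspond to the same n inputs.
    Returns a similarity in [0, 1]; 1 means identical up to rotation/scale."""
    X = X - X.mean(axis=0, keepdims=True)  # center each feature dimension
    Y = Y - Y.mean(axis=0, keepdims=True)
    # CKA(X, Y) = ||Y^T X||_F^2 / (||X^T X||_F * ||Y^T Y||_F)
    return (np.linalg.norm(Y.T @ X, "fro") ** 2
            / (np.linalg.norm(X.T @ X, "fro") * np.linalg.norm(Y.T @ Y, "fro")))

# Toy check: a representation compared against an orthogonally rotated copy
# of itself scores 1, since linear CKA ignores rotations of feature space.
rng = np.random.default_rng(0)
acts = rng.normal(size=(100, 32))                 # 100 inputs, 32 features
rot, _ = np.linalg.qr(rng.normal(size=(32, 32)))  # random orthogonal matrix
print(f"{linear_cka(acts, acts @ rot):.3f}")      # prints 1.000
```

In a cross-model setting, `X` and `Y` would be activations from two different architectures recorded on the same batch of inputs; a higher CKA score indicates more compatible representational geometry.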
Disciplines :
Computer science
Author, co-author :
NIKOOROO, Mohammadsaleh ;  University of Luxembourg > Faculty of Science, Technology and Medicine (FSTM) > Department of Computer Science (DCS)
ENGEL, Thomas ;  University of Luxembourg > Faculty of Science, Technology and Medicine (FSTM) > Department of Computer Science (DCS)
Language :
English
Title :
Cross-Model Semantics in Representation Learning
Publication date :
05 August 2025
Available on ORBilu :
since 10 January 2026
