References of "2016"
Full Text
Peer Reviewed
A performance evaluation of weight-constrained conditioned portfolio optimization
Schiltz, Jang UL; Boissaux, Marc

Scientific Conference (2016, December 15)

Full Text
Peer Reviewed
A Model-Based Development Environment for Rapid-Prototyping of Latency-Sensitive Automotive Control Software
Sundharam, Sakthivel Manikandan UL; Havet, Lionel; Altmeyer, Sebastian et al

in Proceedings of the 6th International Symposium on Embedded Computing and System Design (ISED 2016) (2016, December 15)

Innovation in the field of automotive embedded systems increasingly relies on software-implemented functions. The control laws of these functions typically assume deterministic sampling rates and constant delays from input to output. However, on the target processors, the execution times of the software depend on many factors, such as the amount of interference from other tasks, resulting in varying delays from sensing to actuating. Three tool-supported approaches, namely TrueTime, T-Res, and SimEvents, have been developed to facilitate the evaluation of how timing latencies affect control performance. However, these approaches support the simulation of control algorithms, but not their actual implementation. In this paper, we present a model interpretation engine running in a co-simulation environment to study control performance while taking run-time delays into account. Natively available introspection features facilitate the implementation of self-adaptive and fault-tolerance strategies to mitigate and compensate for the run-time latencies. A DC servo controller is used as a supporting example to illustrate our approach. Experiments on controller tasks with injected delays show that our approach is on par with the existing techniques with respect to simulation. We then discuss the main benefits of our development approach, namely support for rapid prototyping and re-use of the simulation model at run-time, resulting in productivity and quality gains.
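The sensing-to-actuating delay problem this abstract describes can be reproduced in a few lines. The following sketch is only an illustration of the general issue, not of the paper's model interpretation engine; the first-order servo plant and the PI gains are assumed values chosen for stability. A fixed actuation delay is injected through a FIFO buffer of control samples, and the integrated tracking error quantifies the degradation.

```python
# Illustrative sketch (assumed plant/gains, not the paper's setup): a
# first-order DC servo x' = -a*x + b*u under PI control, with a fixed
# sensing-to-actuating delay injected as a FIFO buffer of control samples.
from collections import deque

def simulate(delay_steps, t_end=10.0, dt=0.01):
    a, b = 1.0, 1.0          # assumed plant parameters
    kp, ki = 2.0, 1.0        # assumed PI gains
    x, integ, ref = 0.0, 0.0, 1.0
    buf = deque([0.0] * delay_steps)   # pending (delayed) control samples
    iae = 0.0                          # integral of absolute tracking error
    for _ in range(int(t_end / dt)):
        e = ref - x
        integ += e * dt
        u = kp * e + ki * integ        # control computed from the current sample
        buf.append(u)
        u_applied = buf.popleft()      # but actuated only after the delay
        x += (-a * x + b * u_applied) * dt
        iae += abs(e) * dt
    return iae

iae_ideal = simulate(0)      # the zero-delay assumption of the control law
iae_delayed = simulate(50)   # a 0.5 s sensing-to-actuating delay
```

With these parameters the loop remains stable under the delay, but the integrated error grows, which is exactly the effect the paper's co-simulation environment is built to study.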

Full Text
Peer Reviewed
A variational formulation of dissipative quasicontinuum methods
Rokos, Ondrej; Beex, Lars UL; Peerlings, Ron et al

in International Journal of Solids and Structures (2016), 102-103

Lattice systems and discrete networks with dissipative interactions are successfully employed as meso-scale models of heterogeneous solids. As the application scale generally is much larger than that of the discrete links, physically relevant simulations are computationally expensive. The QuasiContinuum (QC) method is a multiscale approach that reduces the computational cost of direct numerical simulations by fully resolving complex phenomena only in regions of interest while coarsening elsewhere. In previous work (Beex et al., J. Mech. Phys. Solids 64, 154-169, 2014), the originally conservative QC methodology was generalized to a virtual-power-based QC approach that includes local dissipative mechanisms. In this contribution, the virtual-power-based QC method is reformulated from a variational point of view, by employing the energy-based variational framework for rate-independent processes (Mielke and Roubíček, Rate-Independent Systems: Theory and Application, Springer-Verlag, 2015). By construction it is shown that the QC method with dissipative interactions can be expressed as a minimization problem of a properly built energy potential, providing solutions equivalent to those of the virtual-power-based QC formulation. The theoretical considerations are demonstrated on three simple examples. For these examples we verify energy consistency, quantify relative errors in energies, and discuss errors in internal variables obtained for different meshes and two summation rules.
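The central claim of the variational reformulation, that a dissipative update can be recovered by minimizing an incremental energy potential, can be illustrated in the simplest rate-independent setting. The sketch below is a hypothetical 1D elastoplastic spring, not the QC method itself; the stiffness k and yield stress sigma_y are assumed values. It compares the closed-form minimizer of the incremental potential (stored energy plus dissipation distance) against a brute-force minimization, recovering the classical return-mapping update.

```python
# Toy 1D analogue (assumption, not the QC method): the new internal variable z
# minimizes the incremental potential
#   pi(z) = 0.5*k*(eps - z)**2 + sigma_y*abs(z - z_prev)
# i.e. stored elastic energy plus a rate-independent dissipation distance.

def incremental_minimizer(eps, z_prev, k=100.0, sigma_y=5.0):
    """Closed-form minimizer of the convex incremental potential pi(z)."""
    trial = k * (eps - z_prev)          # trial stress at frozen internal variable
    if abs(trial) <= sigma_y:           # stationary inside the elastic range
        return z_prev
    sign = 1.0 if trial > 0 else -1.0
    return eps - sign * sigma_y / k     # plastic branch: k*(z - eps) + sign*sigma_y = 0

def brute_force_minimizer(eps, z_prev, k=100.0, sigma_y=5.0, n=200001):
    """Grid search over pi(z) on [-1, 1], to check the closed form."""
    pi = lambda z: 0.5 * k * (eps - z) ** 2 + sigma_y * abs(z - z_prev)
    zs = [-1.0 + 2.0 * i / (n - 1) for i in range(n)]
    return min(zs, key=pi)
```

The closed-form branch structure is precisely the classical elastic-predictor/plastic-corrector (return-mapping) update, obtained here purely by minimization, which is the equivalence the paper establishes for the full QC setting.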

Peer Reviewed
N-point Virasoro algebras are multi-point Krichever-Novikov type algebras
Schlichenmaier, Martin UL

Scientific Conference (2016, December 14)

Full Text
Peer Reviewed
Adaptive Control of Hysteretic Robotic Arm in Operational Space
Kannan, Somasundar UL; Quintanar Guzman, Serket UL; Bezzaoucha, Souad UL et al

in 5th International Conference on Mechatronics and Control Engineering (ICMCE), Venice, Italy (2016, December 14)

The focus of this article is the Operational Space Control of a single-degree-of-freedom robotic arm with hysteretic joint behaviour due to actuation by a single Shape Memory Alloy (SMA) wire. A Closed-Loop Inverse Kinematics algorithm is used in the outer loop, with adaptive joint control in the inner loop. A composite stability analysis is used to analyse the stability of the closed-loop system, which is finally validated through a simulation study.
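The outer-loop idea can be sketched in isolation. The following is a minimal closed-loop inverse kinematics (CLIK) loop for a hypothetical 1-DOF arm whose end-effector height is x = L*sin(q); the SMA hysteresis and the adaptive inner loop of the paper are not modeled here (the joint is assumed ideal), and the gain and link length are assumed values.

```python
# Minimal CLIK sketch (assumed 1-DOF kinematics, ideal joint): integrate
#   qdot = (1/J) * K * (x_des - x)   with   x = L*sin(q),  J = dx/dq = L*cos(q)
import math

def clik_joint_reference(x_des, q0=0.1, L=1.0, K=5.0, dt=0.01, steps=2000):
    """Drive the task-space error to zero and return the joint reference."""
    q = q0
    for _ in range(steps):
        x = L * math.sin(q)
        J = L * math.cos(q)             # task Jacobian
        if abs(J) < 1e-6:               # guard against the singular configuration
            break
        q += (1.0 / J) * K * (x_des - x) * dt
    return q
```

In the closed loop the task-space error decays exponentially at rate K, so the joint reference converges to the analytical inverse kinematics solution; in the paper this reference is then tracked by the adaptive inner-loop controller of the SMA-actuated joint.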

Full Text
Peer Reviewed
Using Virtual Desktop Infrastructure to Improve Power Efficiency in Grinfy System
Ibrahim, Abdallah Ali Zainelabden Abdallah UL; Kliazovich, Dzmitry UL; Bouvry, Pascal UL et al

in IEEE 8th International Conference on Cloud Computing Technology and Science (CloudCom), Luxembourg 2016 (2016, December 13)

Saving power has become one of the main objectives in the information technology industry and in research, as companies spend considerable money on power consumption. Virtual Desktop Infrastructure (VDI) is a new way of delivering operating systems remotely: the operating systems execute in a cloud data center, and users' desktops and applications are accessed through thin client devices, each consisting of a screen attached to a small CPU. VDI has benefits in terms of cost reduction and energy saving. In this paper, we increase the power saved by the Grinfy system. Without VDI, Grinfy can save at least 30% of the energy consumption of its client companies. By integrating VDI into computing systems and using Grinfy, power efficiency can be improved and savings can exceed 30%. The energy-saving features of VDI are also illustrated by experiment and will be integrated into the Grinfy system to increase the percentage of energy saved.

L’artiste, l’écrit et le monument. Signatures épigraphiques en France au Moyen Âge central
Mineo, Emilie UL

Doctoral thesis (2016)

The doctoral thesis was devoted to the study of a corpus of French inscriptions of the 11th and 12th centuries by which an individual claims the making of an artistic or monumental work. By renewing the traditional approach to these "signatures", which has mostly been tied to questions of attribution, and by considering these epigraphic documents in their double dimension as both text and object, the thesis probes their mechanisms of production and their functions. The study opens with a historiographical survey and a theoretical reflection on the epigraphic signature, which is redefined as "an inscription whose function is to assign to one or more individuals, by means of a verbal statement, the responsibility for a work to which the inscription is materially and/or visually attached" (isolated anthroponyms and masons' marks or glyptographs are therefore excluded from the inventory). The first part of the thesis addresses the problem of the authorial value of these inscriptions. The texts of the recorded epigraphic signatures in fact provide extremely laconic information about the identity of the signatory and his role in the making of the work. It is therefore difficult to attribute these attestations systematically to the artist, all the more so since such a category of thought appears foreign to the mentality of the central Middle Ages. Moreover, the question of artists' literacy had until now been neglected, even though it is clearly linked to the production of epigraphic signatures as written artefacts. The second part of the thesis therefore seeks to assess the writing ability of those who executed these inscriptions and to probe the contexts in which they could acquire the necessary cultural and technical skills.
The technical and linguistic analysis of this documentation reveals varied situations of literacy, but ones that always attest an active relationship to writing on the part of the craftsman in charge of the inscription, contrary to what was commonly assumed. A third and final part of the study addresses what is at stake in the public inscription of an individual's name in association with a monumental work. The study of the visual staging of the signatures shows that their material and topographical characteristics are not always consistent with the goal of universal publicity often ascribed to them, which would make these inscriptions a platform for displaying the artist and his talent. Through this analysis of the conditions of reception, and by relating the signatures to liturgical practices of naming, an interpretation of this singular act of writing is proposed for the context under study (France, 11th-12th centuries) in a perspective that is above all ecclesiological and eschatological. In the appendix volume of the thesis, each of the 51 surviving inscriptions studied in the synthesis is presented as an analytical notice. This catalogue is followed by a table summarizing the data on the trade and social status of the craftsmen in the ArtiChO database (Artistes et artisans dans les chartes antérieures à 1121 conservées en France), designed to organize and exploit the 140 charters from the TELMA database (http://www.cn-telma.fr/originaux/index/) that mention artists and craftsmen in various capacities.

La grande région SaarLorLux, riche de ses travailleurs frontaliers
Pigeron-Piroth, Isabelle UL; Belkacem, Rachid

E-print/Working paper (2016)

Peer Reviewed
L'enseignement transsystémique du droit des contrats: l'expérience luxembourgeoise
Ancel, Pascal UL

Scientific Conference (2016, December 13)

Full Text
Cohomologies and derived brackets of Leibniz algebras
Cai, Xiongwei UL

Doctoral thesis (2016)

In this thesis, we work on the structure of Leibniz algebras and develop cohomology theories for them. The motivation comes from:
• Roytenberg, Stienon-Xu and Ginot-Grutzmann's work on standard and naive cohomology of Courant algebroids (Courant-Dorfman algebras);
• Kosmann-Schwarzbach, Roytenberg and Alekseev-Xu's constructions of derived brackets for Courant algebroids;
• classical equivariant cohomology theory and generalized geometry.
This thesis consists of three parts:
1. We introduce standard cohomology and naive cohomology for a Leibniz algebra. We discuss their properties and show that they are isomorphic. By similar methods, we prove a generalization of Ginot-Grutzmann's theorem on transitive Courant algebroids, which was conjectured by Stienon-Xu. The relation between the standard complex of a Leibniz algebra and that of its corresponding crossed product is also discussed.
2. We observe a canonical 3-cochain in the standard complex of a Leibniz algebra. We construct a bracket on the subspace consisting of so-called representable cochains, and prove that this subspace becomes a graded Poisson algebra. Finally we show that for a fat Leibniz algebra, the Leibniz bracket can be represented as a derived bracket.
3. Inspired by the notion of a Lie algebra action and the idea of generalized geometry, we introduce the notion of a generalized action of a Lie algebra g on a smooth manifold M, namely a homomorphism of Leibniz algebras from g to the generalized tangent bundle TM+T*M. We define the interior product and Lie derivative so that the standard complex of TM+T*M becomes a g-differential algebra, and then discuss its equivariant cohomology. We also study the equivariant cohomology of a subcomplex of a Leibniz complex.
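For orientation, the two standard identities underlying this work can be stated in their simplest, ungraded form (textbook material; sign conventions in the graded case vary by author and are suppressed here):

```latex
% A (left) Leibniz bracket: the left adjoint action acts as a derivation,
% but the bracket need not be skew-symmetric:
[x,[y,z]] \;=\; [[x,y],z] \;+\; [y,[x,z]]
% Derived-bracket construction: given a Lie bracket [\cdot,\cdot] and a
% derivation d with d^2 = 0, the operation
[x,y]_d \;:=\; [dx,\, y]
% satisfies the Leibniz identity above (by the Jacobi identity and
% d[a,b] = [da,b] + [a,db]), yet is in general not skew-symmetric.
```

This is the sense in which part 2 of the thesis "represents the Leibniz bracket as a derived bracket" for fat Leibniz algebras.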

Full Text
Peer Reviewed
Avoiding Leakage and Synchronization Attacks through Enclave-Side Preemption Control
Volp, Marcus UL; Lackorzynski, Adam; Decouchant, Jérémie UL et al

Scientific Conference (2016, December 12)

Intel SGX is the latest processor architecture promising secure code execution despite large, complex and hence potentially vulnerable legacy operating systems (OSs). However, two recent works identified vulnerabilities that allow an untrusted management OS to extract secret information from Intel SGX's enclaves, and to violate their integrity by exploiting concurrency bugs. In this work, we re-investigate delayed preemption (DP) in the context of Intel SGX. DP is a mechanism originally proposed for L4-family microkernels as a replacement for disabling interrupts. Recapitulating earlier results on language-based information-flow security, we illustrate the construction of leakage-free code for enclaves. However, as long as adversaries have fine-grained control over preemption timing, these solutions are impractical from a performance/complexity perspective. To overcome this, we resort to delayed preemption, and sketch a software implementation for hypervisors providing enclaves as well as a hardware extension for systems like SGX. Finally, we illustrate how static analyses for SGX may be extended to check the confidentiality of preemption-delaying programs.

Full Text
Elastography under uncertainty
Hale, Jack UL; Farrell, Patrick; Bordas, Stéphane UL

Poster (2016, December 12)

Full Text
Essays on Inequality, Public Policy, and Banking
Mavridis, Dimitrios UL

Doctoral thesis (2016)

Peer Reviewed
N-point Virasoro algebras are multi-point Krichever-Novikov type algebras
Schlichenmaier, Martin UL

Scientific Conference (2016, December 12)

Full Text
On the Impact of Multi-GNSS Solutions on Satellite Products and Positioning
Abraha, Kibrom Ebuy UL; Teferle, Felix Norman UL; Hunegnaw, Addisu UL et al

Poster (2016, December 12)

In Global Navigation Satellite System (GNSS) coordinate time series, unrecognised errors and un-modelled (periodic) effects may bias the non-linear motions induced by geophysical signals. Such spurious signals can arise either from un-modelled long-periodic signals or from the propagation of sub-daily signals into the time series. Understanding and mitigating these errors is vital to reducing biases and revealing subtle geophysical signals. Mostly, the spurious signals are caused by un-modelled errors associated with the draconitic years, satellite ground-track repeats and absorption into resonant GNSS orbits. Accordingly, different features can be observed in GNSS-derived products from different single-GNSS or combined-GNSS solutions. To assess the nature of periodic signals in station coordinate time series, Precise Point Positioning (PPP) solutions are generated using the Bernese GNSS Software V5.2. The solutions consider only GPS, only GLONASS or combined GPS+GLONASS (GNSS) observations. We assess the periodic signals of station coordinates computed using the combined International GNSS Service (IGS) products and those of four of its Analysis Centers (ACs).
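As a toy illustration of how such periodic signals are quantified, the sketch below fits a single harmonic to an unevenly sampled coordinate series by least squares, the basic operation behind scanning position residuals for draconitic-type lines. The data are synthetic with an assumed amplitude, noise level and period (the value 351.4 days is used here as a nominal GPS draconitic period); none of this is taken from the poster's results.

```python
# Toy harmonic fit to a gappy synthetic series (assumed values, not GNSS data).
import math
import random

def solve3(A, b):
    """Gaussian elimination with partial pivoting for a 3x3 linear system."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for c in range(col, 4):
                M[r][c] -= f * M[col][c]
    x = [0.0] * 3
    for r in (2, 1, 0):
        x[r] = (M[r][3] - sum(M[r][c] * x[c] for c in range(r + 1, 3))) / M[r][r]
    return x

def harmonic_amplitude(t, y, period):
    """Least-squares fit y ~ a + b*cos(wt) + c*sin(wt); return sqrt(b^2 + c^2)."""
    w = 2.0 * math.pi / period
    cols = [[1.0] * len(t),
            [math.cos(w * ti) for ti in t],
            [math.sin(w * ti) for ti in t]]
    A = [[sum(p * q for p, q in zip(cols[i], cols[j])) for j in range(3)]
         for i in range(3)]
    rhs = [sum(p * yi for p, yi in zip(cols[i], y)) for i in range(3)]
    _, b, c = solve3(A, rhs)
    return math.hypot(b, c)

# Synthetic "up" component: data gaps, a 2 mm harmonic, white noise.
random.seed(1)
days = sorted(random.sample(range(3000), 1200))
y = [2.0 * math.cos(2.0 * math.pi * d / 351.4 + 0.7) + random.gauss(0.0, 1.0)
     for d in days]
amp = harmonic_amplitude(days, y, 351.4)   # recovers roughly 2 mm
```

Repeating the fit over a grid of trial periods gives a crude spectrum from which draconitic-type peaks can be compared between single-GNSS and combined solutions.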

Full Text
Bayesian inference for parameter identification in computational mechanics
Rappel, Hussein UL; Beex, Lars UL; Hale, Jack UL et al

Poster (2016, December 12)

Full Text
Real-time error control for surgical simulation
Bui, Huu Phuoc UL; Tomar, Satyendra UL; Courtecuisse, Hadrien et al

Poster (2016, December 12)

Objective: To present the first real-time a posteriori error-driven adaptive finite element approach for real-time simulation and to demonstrate the method on a needle insertion problem. Methods: We use corotational elasticity and a needle/tissue interaction model based on friction. The problem is solved using finite elements within SOFA. The refinement strategy relies upon a hexahedron-based finite element method, combined with a posteriori error estimation driven local h-refinement, for simulating soft tissue deformation. Results: We control the local and global error level in the mechanical fields (e.g. displacement or stresses) during the simulation. We show the convergence of the algorithm on academic examples, and demonstrate its practical usability on a percutaneous procedure involving needle insertion in a liver. For the latter case, we compare the force-displacement curves obtained from the proposed adaptive algorithm with those obtained from a uniform refinement approach. Conclusions: Error control guarantees that a tolerable error level is not exceeded during the simulations. Local mesh refinement accelerates simulations. Significance: Our work provides a first step towards discriminating between discretization error and modeling error by providing a robust quantification of discretization error during simulations.
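The refine-where-the-indicator-is-largest loop at the heart of such adaptive schemes can be shown in its simplest form. The sketch below uses adaptive piecewise-linear interpolation in 1D as an assumed stand-in for the paper's corotational FEM with hexahedral h-refinement: only the element with the largest local error indicator is bisected, until a global tolerance is met.

```python
# Error-driven local h-refinement in its simplest setting (1D interpolation,
# an illustrative analogue of the paper's FEM scheme, not the scheme itself).
import math

def adapt(f, a=0.0, b=1.0, tol=1e-3, n0=8):
    """Bisect the worst element until every local indicator drops below tol."""
    nodes = [a + (b - a) * i / n0 for i in range(n0 + 1)]
    while True:
        errs = []
        for i in range(len(nodes) - 1):
            xm = 0.5 * (nodes[i] + nodes[i + 1])
            # local a posteriori indicator: interpolation error at the midpoint
            errs.append(abs(f(xm) - 0.5 * (f(nodes[i]) + f(nodes[i + 1]))))
        worst = max(range(len(errs)), key=errs.__getitem__)
        if errs[worst] < tol:
            return nodes
        nodes.insert(worst + 1, 0.5 * (nodes[worst] + nodes[worst + 1]))

# A sharp local feature: refinement should cluster near x = 0.3, the 1D
# counterpart of refining only around the needle tip.
mesh = adapt(lambda x: math.exp(-200.0 * (x - 0.3) ** 2))
```

The tolerance check on the worst indicator plays the role of the global error control, while bisecting only one element at a time is what keeps the cost compatible with real-time rates.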

Full Text
Peer Reviewed
A Probabilistic View of Neighborhood-based Recommendation Methods
Wang, Jun UL; Tang, Qiang

in IEEE International Conference on Data Mining (ICDM 2016), CLOUDMINE workshop (2016, December 12)

Probabilistic graphical models are an elegant framework for compactly representing complex real-world observations by modeling uncertainty and logical flow (conditionally independent factors). In this paper, we present a probabilistic framework for neighborhood-based recommendation methods (PNBM), in which similarity is regarded as an unobserved factor. PNBM thus casts the estimation of user preferences as the maximization of a posterior over similarity. We further introduce a novel multi-layer similarity descriptor which models and learns the joint influence of various features under PNBM, and name the new framework MPNBM. Empirical results on real-world datasets show that MPNBM allows very accurate estimation of user preferences.
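For orientation, the classical baseline that PNBM reinterprets probabilistically can be sketched as follows. This is a generic user-based neighborhood predictor with cosine similarity over a tiny made-up rating dictionary; it is not the PNBM/MPNBM model itself, in which the similarity would instead be a latent variable estimated by maximizing a posterior.

```python
# Classical neighborhood-based prediction (the baseline, not PNBM itself):
# predict a user's rating as a similarity-weighted average over the other
# users who rated the item.
import math

def cosine(u, v):
    """Cosine similarity between two sparse rating vectors (dicts item -> rating)."""
    common = set(u) & set(v)
    if not common:
        return 0.0
    num = sum(u[i] * v[i] for i in common)
    den = (math.sqrt(sum(x * x for x in u.values()))
           * math.sqrt(sum(x * x for x in v.values())))
    return num / den

def predict(ratings, user, item):
    """Similarity-weighted average of neighbors' ratings; None if no neighbor."""
    num = den = 0.0
    for other, r in ratings.items():
        if other == user or item not in r:
            continue
        s = cosine(ratings[user], r)
        num += s * r[item]
        den += abs(s)
    return num / den if den else None

# Hypothetical toy data: users "a".."c" rating items 1..3.
ratings = {"a": {1: 5, 2: 3},
           "b": {1: 5, 2: 3, 3: 4},
           "c": {1: 1, 2: 5, 3: 2}}
pred = predict(ratings, "a", 3)
```

Here user "a" resembles "b" more than "c", so the prediction for item 3 is pulled toward b's rating; PNBM's contribution is to treat that similarity weight as an unobserved random factor rather than a fixed heuristic.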

Full Text
Error analysis of Tide Gauge Benchmark Monitoring (TIGA) Analysis Center stacked solutions
Hunegnaw, Addisu UL; Teferle, Felix Norman UL; Abraha, Kibrom Ebuy UL et al

Poster (2016, December 12)

In 2013 the International GNSS Service (IGS) Tide Gauge Benchmark Monitoring (TIGA) Working Group (WG) started their reprocessing campaign, which proposes to re-analyze all relevant Global Positioning System (GPS) observations from 1995 to the end of 2013. This re-processed dataset will provide high-quality estimates of land motions, enabling regional and global high-precision geophysical/geodetic studies. Several of the individual TIGA Analysis Centers (TACs) have completed processing the full history of GPS observations recorded by the IGS global network, as well as many other GPS stations at or close to tide gauges, which are available from the TIGA data center at the University of La Rochelle (www.sonel.org). The TAC solutions contain a total of over 700 stations. This study focuses on the evaluation of any systematic errors present in three TAC SINEX solutions: those of the British Isles continuous GNSS Facility – University of Luxembourg consortium (BLT), the GeoForschungsZentrum (GFZ) Potsdam, and the University of La Rochelle (ULR). We have analyzed the residual position time series of the individual TACs using a combination of automatic and manual discontinuity identification, applying a post-seismic deformation model adopted from ITRF2014 for those stations that are affected by earthquakes, followed by the stacking of the daily solutions of each TAC into a long-term linear frame. We have carried out the error analysis using the Combination and Analysis of Terrestrial Reference Frames (CATREF) software package. The TIGA Combination Centre (TCC) at the University of Luxembourg (UL) is responsible for providing a combined solution with a global set of vertical land movement estimates.
