Last 7 days
Full Text
Peer Reviewed
Towards the Application of Neuromorphic Computing to Satellite Communications
Ortiz Gomez, Flor de Guadalupe UL; Lagunas, Eva UL; Alves Martins, Wallace UL et al

in Towards the Application of Neuromorphic Computing to Satellite Communications (2022, October)

Artificial intelligence (AI) has recently received significant attention as a key enabler for future 5G-and-beyond terrestrial wireless networks. The application of AI to satellite communications is also gaining momentum to realize a more autonomous operation with reduced requirements in terms of human intervention. The adoption of AI for satellite communications will set new requirements on computing processors, which will need to support large workloads as efficiently as possible under harsh environmental conditions. In this context, neuromorphic processing (NP) is emerging as a bio-inspired solution to address pattern recognition tasks involving multiple, possibly unstructured, temporal signals and/or requiring continual learning. The key merits of the technology are its energy efficiency and its capacity for on-device adaptation. In this paper, we highlight potential use cases and applications of NP to satellite communications. We also explore major technical challenges for the implementation of space-based NP, focusing on the available NP chipsets.
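
A flavour of why neuromorphic hardware is energy-efficient can be conveyed with a minimal leaky integrate-and-fire (LIF) neuron, the basic unit of the spiking networks such processors execute natively: work is only done when a spike fires. This is an illustrative sketch; all parameter values are assumptions, not taken from the paper.

```python
import numpy as np

def lif_run(input_current, dt=1e-3, tau=20e-3, v_thresh=1.0, v_reset=0.0):
    """Leaky integrate-and-fire neuron driven by a sampled input current."""
    v, spikes = 0.0, []
    for i_t in input_current:
        v += dt / tau * (-v + i_t)      # membrane leaks and integrates input
        if v >= v_thresh:
            spikes.append(1)            # a spike is emitted only on threshold
            v = v_reset                 # crossing: computation is event-driven
        else:
            spikes.append(0)
    return np.array(spikes)

rng = np.random.default_rng(0)
spike_train = lif_run(rng.uniform(0.0, 2.5, size=1000))
print("spikes emitted:", spike_train.sum())
```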

Full Text
Peer Reviewed
A multilingual dataset of COVID-19 vaccination attitudes on Twitter
Chen, Ninghan UL; Chen, Xihui UL; Pang, Jun UL

in Data in Brief (2022), 44

Full Text
Peer Reviewed
Non-Coherent Massive MIMO Integration in Satellite Communication
Monzon Baeza, Victor UL; Ha, Vu Nguyen UL; Querol, Jorge UL et al

Scientific Conference (2022, October)

The massive multiple-input multiple-output (mMIMO) technique is considered an efficient means of significantly improving transmission rates in current and upcoming wireless communication systems, such as 5G and beyond. However, implementing this technology faces the critical issue of acquiring a large amount of channel state information. This problem becomes even more critical in integrated satellite and terrestrial networks (3GPP Release 15) due to the considerably high transmission delay. To deal with this challenging problem, the mMIMO-empowered non-coherent technique is a promising solution. To the best of our knowledge, this paper is the first work to consider employing non-coherent mMIMO in satellite communication systems. This work aims to analyse the challenges and opportunities emerging from this integration, and we identify the open issues in this conjunction. The preliminary results presented in this work show that the performance measured in bit error rate (BER) and the required number of antennas are not far from those required for terrestrial links. Furthermore, thanks to mMIMO in conjunction with the non-coherent approach, we can operate in a low signal-to-noise ratio (SNR) regime, which is a significant advantage for satellite links.
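
To make "non-coherent" concrete, the Monte-Carlo sketch below detects on-off keying purely from the energy averaged across a large receive array, with no channel state information at either end; the modulation, threshold, and parameters are illustrative assumptions rather than the authors' scheme.

```python
import numpy as np

rng = np.random.default_rng(1)
M, n_bits = 128, 20_000            # receive antennas, simulated bits
snr_db = 0.0                       # low-SNR regime
noise_var = 10 ** (-snr_db / 10)

bits = rng.integers(0, 2, n_bits)
errors = 0
for b in bits:
    # Rayleigh-fading channel and AWGN, both unknown to the receiver.
    h = (rng.standard_normal(M) + 1j * rng.standard_normal(M)) / np.sqrt(2)
    n = np.sqrt(noise_var / 2) * (rng.standard_normal(M) + 1j * rng.standard_normal(M))
    y = b * h + n
    energy = np.mean(np.abs(y) ** 2)        # averages out fading and noise
    b_hat = int(energy > noise_var + 0.5)   # midpoint of the two mean energies
    errors += (b_hat != b)

print("BER:", errors / n_bits)
```

As M grows, the averaged energy concentrates around its mean, which is what lets detection work without channel estimates even at low SNR.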

Full Text
Peer Reviewed
PEELER: Learning to Effectively Predict Flakiness without Running Tests
Qin, Yihao; Wang, Shangwen; Liu, Kui et al

in Proceedings of the 38th IEEE International Conference on Software Maintenance and Evolution (2022, October)

Regression testing is a widely adopted approach to expose change-induced bugs as well as to verify the correctness/robustness of code in modern software development settings. Unfortunately, the occurrence of flaky tests leads to a significant increase in the cost of regression testing and eventually reduces the productivity of developers (i.e., their ability to find and fix real problems). State-of-the-art approaches leverage dynamic test information obtained through expensive re-execution of test cases to effectively identify flaky tests. To account for scalability constraints, some recent approaches have built on static test case features, but fall short on effectiveness. In this paper, we introduce PEELER, a new fully static approach for predicting flaky tests through exploring a representation of test cases based on data dependency relations. The predictor is then trained as a neural-network-based model, which achieves at the same time scalability (because it does not require any test execution), effectiveness (because it exploits relevant test dependency features), and practicality (because it can be applied in the wild to find new flaky tests). Experimental validation on 17,532 test cases from 21 Java projects shows that PEELER outperforms the state-of-the-art FlakeFlagger by around 20 percentage points: we catch 22% more flaky tests while yielding 51% fewer false positives. Finally, in a live study with projects in the wild, we reported 21 flakiness cases to developers, among which 12 have already been confirmed as indeed flaky.
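
As a hypothetical illustration of the static-prediction setup (the real PEELER derives a richer representation from data-dependency relations and trains a neural model), the sketch below stands in simple hand-crafted static features and an off-the-shelf classifier; the feature names and labels are invented.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
# Hypothetical static features per test: #data dependencies, #shared fields
# touched, #async/IO-flavoured calls, test length in statements.
X = rng.poisson(lam=[4, 2, 1, 30], size=(500, 4)).astype(float)
# Synthetic labels: flakiness made to depend on the dependency counts
# purely so the demo learns something.
y = (X[:, 0] + 3 * X[:, 2] > 6).astype(int)   # 1 = flaky, 0 = stable

clf = RandomForestClassifier(n_estimators=100, random_state=0)
print("cross-validated F1:", cross_val_score(clf, X, y, cv=5, scoring="f1").mean())
```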

"Thinking-as-usual": A Challenge for Cultural Diversity
Nonoa, Koku Gnatuloma UL

in JOURNAL geroRESEARCH (2022), 7

This article first analyses the relationship between "thinking-as-usual" and the classical concept of culture with regard to the politics of representation and identity. The analysis then turns to the challenges that "thinking-as-usual" poses in the context of cultural diversity. Finally, it shows how this is expressed in discourses, speech acts, and narratives.

Full Text
Peer Reviewed
Automated, Cost-effective, and Update-driven App Testing
Ngo, Chanh Duc UL; Pastore, Fabrizio UL; Briand, Lionel UL

in ACM Transactions on Software Engineering and Methodology (2022), 31(4), 61

Apps’ pervasive role in our society has led to the definition of test automation approaches to ensure their dependability. However, state-of-the-art approaches tend to generate large numbers of test inputs and are unlikely to achieve more than 50% method coverage. In this paper, we propose a strategy to achieve significantly higher coverage of the code affected by updates with a much smaller number of test inputs, thus alleviating the test oracle problem. More specifically, we present ATUA, a model-based approach that synthesizes App models with static analysis, integrates a dynamically-refined state abstraction function, and combines complementary testing strategies, including (1) coverage of the model structure, (2) coverage of the App code, (3) random exploration, and (4) coverage of dependencies identified through information retrieval. Its model-based strategy enables ATUA to generate a small set of inputs that exercise only the code affected by the updates. In turn, this makes common test oracle solutions more cost-effective, as they tend to involve human effort. A large empirical evaluation, conducted with 72 App versions belonging to nine popular Android Apps, has shown that ATUA is more effective and less effort-intensive than state-of-the-art approaches when testing App updates.
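
The idea of exercising only update-affected code with few inputs can be illustrated with a greedy set-cover selection; this is not ATUA's algorithm, just a minimal sketch of the underlying coverage trade-off, with hypothetical test and method names.

```python
# Greedy selection of a small test set covering the methods touched by an update.
def select_tests(coverage, updated):
    """coverage: test name -> set of methods it exercises."""
    remaining, chosen = set(updated), []
    while remaining:
        best = max(coverage, key=lambda t: len(coverage[t] & remaining))
        gained = coverage[best] & remaining
        if not gained:
            break                     # some updated methods are unreachable
        chosen.append(best)
        remaining -= gained
    return chosen

cov = {"t1": {"m1", "m2"}, "t2": {"m2", "m3"}, "t3": {"m4"}}
print(select_tests(cov, updated={"m1", "m3", "m4"}))   # ['t1', 't2', 't3']
```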

Full Text
Peer Reviewed
The Wasserstein Impact Measure (WIM): A practical tool for quantifying prior impact in Bayesian statistics
Ley, Christophe UL; Ghaderinezhad, Fatemeh; Serrien, Ben

in Computational Statistics and Data Analysis (2022), 174

The prior distribution is a crucial building block in Bayesian analysis, and its choice will impact the subsequent inference. It is therefore important to have a convenient way to quantify this impact, as such a measure of prior impact will help to choose between two or more priors in a given situation. To this end, a new approach, the Wasserstein Impact Measure (WIM), is introduced. In three simulated scenarios, the WIM is compared to two competitor prior impact measures from the literature, and its versatility is illustrated via two real datasets.
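
The idea behind the WIM can be sketched concretely: fit the same model under two different priors and measure the Wasserstein-1 distance between the resulting posteriors. The conjugate normal model and all values below are illustrative assumptions, not the paper's experiments.

```python
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(3)
data = rng.normal(loc=1.0, scale=1.0, size=20)   # known data variance 1

def posterior_samples(mu0, tau0_sq, n_draws=100_000):
    # Conjugate update: normal prior N(mu0, tau0_sq) on the mean.
    n = len(data)
    post_var = 1.0 / (1.0 / tau0_sq + n)
    post_mean = post_var * (mu0 / tau0_sq + data.sum())
    return rng.normal(post_mean, np.sqrt(post_var), n_draws)

# Impact of swapping a prior centred at 0 for one centred at 5.
wim = wasserstein_distance(posterior_samples(0.0, 1.0),
                           posterior_samples(5.0, 1.0))
print("WIM between the two priors:", wim)
```

In a non-conjugate setting one would replace the closed-form posteriors with MCMC draws under each candidate prior; the distance computation stays the same.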

Full Text
Optimal coalition splitting with heterogenous strategies
Boucekkine, Raouf; Camacho, Carmen; Ruan, Weihua et al

E-print/Working paper (2022)

We consider a group of players who are initially members of a coalition cooperatively managing a public bad, in this case the stock of pollution. Countries are technologically heterogeneous but the pollution damage is uniform. We characterize the conditions under which a country may eventually split, and when it splits, within an infinite-horizon multi-stage differential game. In contrast to the existing literature, we do not assume that, after splitting, the splitting player and the remaining coalition both adopt Markovian strategies. Instead, we assume that the latter remains committed to the collective control of pollution and plays open-loop, while the splitting player plays Markovian. Within a full linear-quadratic model, we characterize the optimal strategies. We then compare with the outcomes of the case where the splitting player and the remaining coalition both play Markovian strategies. We highlight several interesting results in terms of the implications for long-term pollution levels and the duration of coalitions with heterogeneous strategies.
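
For orientation, a generic linear-quadratic pollution game of the kind the abstract describes can be written as follows; the notation (emission controls e_i, pollution stock S, heterogeneous productivities a_i, uniform damage parameter φ) is assumed for illustration and need not match the paper's exact model.

```latex
% Generic LQ pollution game: heterogeneous benefits, uniform damage.
\begin{align}
  \dot{S}(t) &= \sum_{i=1}^{n} e_i(t) - \delta S(t), \qquad S(0) = S_0,\\
  J_i &= \int_{0}^{\infty} e^{-\rho t}
         \Bigl( a_i e_i(t) - \tfrac{1}{2} e_i(t)^2
                - \tfrac{\varphi}{2} S(t)^2 \Bigr)\, dt .
\end{align}
```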

Full Text
Peer Reviewed
Item response theory and differential test functioning analysis of the HBSC-Symptom-Checklist across 46 countries
Heinz, Andreas UL; Sischka, Philipp UL; Catunda, Carolina UL et al

in BMC Medical Research Methodology (2022), 22(253),

Background: The Symptom Checklist (SCL) developed by the Health Behaviour in School-aged Children (HBSC) study is a non-clinical measure of psychosomatic complaints (e.g., headache and feeling low) that has been used in numerous studies. Several studies have investigated the psychometric characteristics of this scale; however, some psychometric properties remain unclear, especially (a) dimensionality, (b) adequacy of the Graded Response Model (GRM), and (c) measurement invariance across countries. Methods: Data from 229,906 adolescents aged 11, 13 and 15 from 46 countries that participated in the 2018 HBSC survey were analyzed. Adolescents were selected using representative sampling and surveyed by questionnaire in the classroom. Dimensionality was investigated using exploratory graph analysis. In addition, we investigated whether the GRM provided an adequate description of the data. Reliability over the latent variable continuum and differential test functioning across countries were also examined. Results: Exploratory graph analyses showed that the SCL can be considered one-dimensional in 16 countries. However, a comparison of the unidimensional with a post-hoc bifactor GRM showed that deviation from a hypothesized one-dimensional structure was negligible in most countries. Multigroup invariance analyses supported configural and metric invariance, but not scalar invariance, across 32 countries. Alignment analysis showed non-invariance especially for the items irritability, feeling nervous/bad temper, and feeling low. Conclusion: The HBSC-SCL appears to represent a consistent and reliable unidimensional instrument across most countries. This bodes well for population health analyses that rely on this scale as an early indicator of mental health status.
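
For readers unfamiliar with the model, the Graded Response Model underlying the analysis assigns each ordered response category a probability via logistic boundary curves in the latent trait. The sketch below computes category probabilities for one item; the discrimination and threshold values are illustrative.

```python
import numpy as np

def grm_category_probs(theta, a, b):
    """GRM for one item. a: discrimination; b: increasing thresholds."""
    b = np.asarray(b, dtype=float)
    p_ge = 1.0 / (1.0 + np.exp(-a * (theta - b)))   # P(X >= k), k = 1..K-1
    p_ge = np.concatenate(([1.0], p_ge, [0.0]))     # add boundaries k = 0, K
    return p_ge[:-1] - p_ge[1:]                     # P(X = k) per category

# Four thresholds -> five ordered response categories.
probs = grm_category_probs(theta=0.3, a=1.7, b=[-1.0, 0.0, 1.2, 2.1])
print(probs, probs.sum())                           # probabilities sum to 1
```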

Full Text
Peer Reviewed
Stretching directions in cislunar space: Applications for departures and transfer design
Muralidharan, Vivek UL; Howell, Kathleen C.

in Astrodynamics (2022)

Stable or nearly stable orbits do not generally possess well-distinguished manifold structures that assist in designing trajectories for departing from or arriving onto a periodic orbit. For some potential missions, the orbits of interest are selected as nearly stable to reduce the possibility of rapid departure. However, the linearly stable nature of these orbits is also a drawback for their timely insertion into or departure from the orbit. Stable or nearly stable Near Rectilinear Halo Orbits (NRHOs), Distant Retrograde Orbits (DROs), and lunar orbits offer potential long-horizon trajectories for exploration missions and demand efficient operations. The current investigation focuses on leveraging stretching directions as a tool for departure and trajectory design applications. The magnitude of the state variations along the maximum stretching direction is expected to grow rapidly and, therefore, offers information for efficient departure from the orbit. Similarly, maximum stretching in reverse time enables arrival with a minimal maneuver magnitude.
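
Stretching directions of this kind are commonly extracted from the singular value decomposition of the state transition matrix (STM): the right singular vector paired with the largest singular value marks where small state variations grow fastest. A minimal sketch, with a random placeholder where the CR3BP-integrated STM would go:

```python
import numpy as np

rng = np.random.default_rng(4)
stm = rng.standard_normal((6, 6))   # placeholder for Phi(t0 + T, t0); in
                                    # practice, integrate the variational
                                    # equations along the orbit to obtain it.
U, s, Vt = np.linalg.svd(stm)
max_stretch_dir = Vt[0]             # perturb along this direction at t0 ...
print("max stretch factor:", s[0])  # ... and it grows by roughly s[0]
print("departure direction:", max_stretch_dir)
```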

Full Text
TRANSFORMING DATA PREPROCESSING: A HOLISTIC, NORMALIZED AND DISTRIBUTED APPROACH
Tawakuli, Amal UL

Doctoral thesis (2022)

Substantial volumes of data are generated at the edge as a result of an exponential increase in the number of Internet of Things (IoT) applications. IoT data are generated at edge components and, in most cases, transmitted to central or cloud infrastructures via the network. Distributing data preprocessing to the edge, closer to the data sources, would address issues found in the data early in the pipeline. Distribution thus prevents error propagation, removes redundancies, minimizes privacy leakage and optimally summarizes the information contained in the data prior to transmission. This, in turn, avoids wasting valuable yet limited resources at the edge on transmitting data that may contain anomalies and redundancies. New legal requirements such as the GDPR, as well as ethical responsibilities, make data preprocessing that addresses these emerging topics urgent, especially at the edge, before the data leave the premises of data owners.

This PhD dissertation is divided into two parts that focus on two main directions within data preprocessing. The first part focuses on structuring and normalizing the data preprocessing design phase for AI applications. This involved an extensive and comprehensive survey of data preprocessing techniques coupled with an empirical analysis. From the survey, we introduced a holistic and normalized definition and scope of data preprocessing. We also identified the means of generalizing data preprocessing by abstracting preprocessing techniques into categories and sub-categories. Our survey and empirical analysis highlighted dependencies and relationships between the different categories and sub-categories, which determine the order of execution within preprocessing pipelines. The identified categories, sub-categories and their dependencies were assembled into a novel data preprocessing design tool: a template from which application- and dataset-specific preprocessing plans and pipelines are derived. The design tool is agnostic to datasets and applications and is a crucial step towards normalizing, regulating and structuring the design of data preprocessing pipelines. The tool helps practitioners and researchers apply a modern take on data preprocessing that enhances the reproducibility of preprocessed datasets and addresses a broader spectrum of issues in the data.

The second part of the dissertation focuses on leveraging edge computing within an IoT context to distribute data preprocessing at the edge. We empirically evaluated the feasibility of distributing data preprocessing techniques from different categories and assessed the impact of the distribution, including on the consumption of resources such as time, storage, bandwidth and energy. To perform the distribution, we proposed a collaborative edge-cloud framework dedicated to data preprocessing, with two main mechanisms that achieve synchronization and coordination. The synchronization mechanism is an Over-The-Air (OTA) updating mechanism that remotely pushes updated preprocessing plans to the different edge components in response to changes in user requirements or the evolution of data characteristics. The coordination mechanism is a resilient and progressive execution mechanism that leverages a Directed Acyclic Graph (DAG) to represent the data preprocessing plans. Distributed preprocessing plans are shared between different cloud and edge components and are progressively executed while adhering to the topological order dictated by the DAG representation.

To empirically test our proposed solutions, we developed DeltaWing, a prototype of our edge-cloud collaborative data preprocessing framework that consists of three stages: one central stage and two edge stages. A use case was also designed based on a dataset obtained from Honda Research Institute US. Using DeltaWing and the use case, we simulated an automotive IoT application to evaluate our proposed solutions. Our empirical results highlight the effectiveness and positive impact of our framework in reducing the consumption of valuable resources (e.g., ≈57% reduction in bandwidth usage) at the edge, while retaining information (prediction accuracy) and maintaining operational integrity. The two parts of the dissertation are interconnected yet can exist independently. Their combined contributions constitute a generic toolset for the optimization of the data preprocessing phase.
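
The coordination mechanism's progressive execution in topological order can be sketched with Python's standard library; the plan below is illustrative, not DeltaWing's actual preprocessing plan.

```python
from graphlib import TopologicalSorter

# Preprocessing plan as a DAG: step -> set of prerequisite steps.
plan = {
    "impute_missing": set(),
    "remove_outliers": {"impute_missing"},
    "normalize": {"remove_outliers"},
    "reduce_dimensions": {"normalize"},
}

# Stand-in step implementations; real steps would transform the data.
steps = {name: (lambda n=name: print("running", n)) for name in plan}

for step in TopologicalSorter(plan).static_order():
    steps[step]()   # prerequisites always run before their dependents
```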

The impact of blockchain technology on law
Becker, Katrin UL

Presentation (2022, September 22)

Jewish Studies in the Digital Age
Zaagsma, Gerben UL; Stökl Ben Ezra, Daniel; Rürup, Miriam et al

Book published by De Gruyter Oldenbourg (2022)

As in all fields and disciplines of the humanities, Jewish Studies scholars find themselves confronted with the rapidly increasing availability of digital resources (data), new technologies to interrogate and analyze them (tools), and the question of how to critically engage with these developments. This volume discusses how the digital turn has affected the field of Jewish Studies. It explores the current state of the art and probes how digital developments can be harnessed to address the specific questions, challenges and problems that Jewish Studies scholars confront. In a field characterised by dispersed sources, and by heterogeneous scripts and languages that speak to a multitude of cultures and histories, of abundance as well as loss, what is the promise of Digital Humanities methods, and what are the challenges and pitfalls? The articles in this volume were originally presented at the international conference #DHJewish - Jewish Studies in the Digital Age, which was organised at the Centre for Contemporary and Digital History (C²DH) at the University of Luxembourg in January 2021. The first big international conference of its kind, it brought together more than sixty scholars and heritage practitioners to discuss how the digital turn affects the field of Jewish Studies.

Full Text
Jewish Studies in the Digital Age: Introduction
Zaagsma, Gerben UL; Stökl Ben Ezra, Daniel; Rürup, Miriam et al

in Levi, Amalia S.; Zaagsma, Gerben; Stökl Ben Ezra, Daniel (Eds.) et al Jewish Studies in the Digital Age (2022)

Full Text
Peer Reviewed
Methods for increasing the dependability of High-performance, Many-core, System-on-Chips
Graczyk, Rafal UL; Memon, Md Saad UL; Volp, Marcus UL

in Graczyk, Rafal; Memon, Md Saad; Volp, Marcus (Eds.) IAC 2022 congress proceedings, 73rd International Astronautical Congress (IAC) (2022, September 21)

Future space exploration and exploitation missions will require significantly increased autonomy of operation for mission planning, decision-making, and adaptive control techniques. Spacecraft will integrate new processing and compression algorithms that are often augmented with machine learning and artificial intelligence capabilities. This functionality will have to be provided with high levels of robustness, reliability, and dependability for conducting missions successfully. High-reliability requirements for space-grade processors have led to trade-offs in terms of costs, energy efficiency, and performance to obtain robustness. However, while high-performance / low-robustness configurations are acceptable in the Earth's vicinity, where assets remain protected by the planet's magnetosphere, they cease to work in more demanding environments, like cislunar or deep space, where high-energy particles affect modern components heavily, causing temporary or permanent damage and ultimately system failures. The above has led to a situation where state-of-the-art processing elements (processors, co-processors, memories, special-purpose accelerators, and field-programmable gate arrays (FPGAs), all possibly integrated into System-on-a-Chip (SoC) designs) are superior to their high-reliability, space-qualified counterparts in terms of processing power and energy efficiency. For example, one can expect a two-to-three order-of-magnitude performance-per-watt improvement of modern, state-of-the-art (SOTA) devices over space-grade equipment. Likewise, one finds a gap of approximately nine technology nodes between such devices, which translates into a factor-of-25 difference in operations per watt. In this paper, we demonstrate how to utilize part of this enormous performance advantage to increase the robustness and resilience of otherwise susceptible semiconductor devices, while harnessing the remaining processing power to build affordable space systems capable of hosting the compute-intensive functionality that future space missions require. We bridge this performance-reliability gap by researching the enabling building blocks for constructing reliable and secure, space-ready Systems-on-a-Chip from SOTA processing elements.
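
One classical way to spend surplus performance on robustness is replication with majority voting (triple modular redundancy). The sketch below conveys the principle at the software level only; the paper targets hardware- and architecture-level building blocks, and everything here is an illustrative assumption.

```python
from collections import Counter

def tmr(replicas, *args):
    """Run redundant replicas of a computation and vote on the result."""
    results = [f(*args) for f in replicas]
    value, votes = Counter(results).most_common(1)[0]
    if votes < 2:
        raise RuntimeError("no majority: all replicas disagree")
    return value

def compute(x):
    return x * x

def faulty(x):
    return x * x + 1   # models a particle-induced upset in one replica

print(tmr([compute, compute, faulty], 7))   # 49: the faulty replica is outvoted
```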

Full Text
Hardware-in-the-loop Proximity Operations in Cislunar Space
Muralidharan, Vivek UL; Makhdoomi, Mohatashem Reyaz UL; Barad, Kuldeep Rambhai UL et al

Scientific Conference (2022, September 20)

Space missions to Near Rectilinear Halo Orbits (NRHOs) in the Earth-Moon system are upcoming. A rendezvous technique in cislunar space is proposed in this investigation, one that leverages coupled orbit and attitude dynamics in the Circular Restricted Three-Body Problem (CR3BP). An autonomous Guidance, Navigation and Control (GNC) technique is demonstrated in which a chaser spacecraft approaches a target spacecraft in the southern 9:2 synodic-resonant L2 NRHO, the orbit that currently serves as the baseline for NASA's Gateway. A two-layer control approach is contemplated. First, a nonlinear optimal controller identifies an appropriate baseline rendezvous path, both in position and orientation. As the spacecraft progresses along the pre-computed baseline path, optical sensors measure the pose of the chaser relative to the target. A Kalman filter processes these observations and offers precise state estimates. A linear controller compensates for any deviations identified from the predetermined rendezvous path. The efficacy of the GNC technique is tested by considering a complex scenario in which the rendezvous operation is conducted with a non-cooperative tumbling target. Hardware-in-the-loop laboratory experiments are conducted as proof of concept to validate the guidance algorithm, with observations supplemented by optical navigation techniques.
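
The navigation layer described above follows the standard predict-correct structure of a linear Kalman filter. A minimal sketch, with placeholder matrices rather than the paper's CR3BP-based models:

```python
import numpy as np

def kalman_step(x, P, z, F, Q, H, R):
    # Predict with the (linearized) relative dynamics ...
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # ... then correct with the optical pose measurement z.
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)          # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

n = 6                                            # relative position + velocity
x, P = np.zeros(n), np.eye(n)                    # initial estimate, covariance
F, Q = np.eye(n), 1e-6 * np.eye(n)               # placeholder dynamics, process noise
H, R = np.eye(3, n), 1e-3 * np.eye(3)            # camera observes relative position
x, P = kalman_step(x, P, z=np.array([0.1, -0.05, 0.02]), F=F, Q=Q, H=H, R=R)
print("updated position estimate:", x[:3])
```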

Full Text
Peer Reviewed
Machine learning applied to higher order functional representations of omics data reveals biological pathways associated with Parkinson's Disease
Gómez de Lope, Elisa UL; Glaab, Enrico UL

Poster (2022, September 18)

Background: Despite the increasing prevalence of Parkinson's Disease (PD) and research efforts to understand its underlying molecular pathogenesis, early diagnosis of PD remains a challenge. Machine learning analysis of blood-based omics data is a promising non-invasive approach to finding molecular fingerprints associated with PD that may enable an early and accurate diagnosis. Description: We applied several machine learning classification methods to public omics data from PD case/control studies. We used aggregation statistics and Pathifier's pathway deregulation scores to generate higher-order functional representations of the data, such as pathway-level features. The models' performance and most relevant predictive features were compared with those of individual feature-level predictors. The resulting diagnostic models from individual features and Pathifier's pathway deregulation scores achieve significant area under the receiver operating characteristic curve (AUC) scores for both cross-validation and external testing. Furthermore, we identify plausible biological pathways associated with PD diagnosis. Conclusions: We have successfully built machine learning models at the pathway level and the single-feature level to study blood-based omics data for PD diagnosis. Plausible biological pathway associations were identified. Furthermore, we show that pathway deregulation scores can serve as robust and biologically interpretable predictors for PD.
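
A minimal sketch of pathway-level feature construction by aggregation: gene columns are averaged within (hypothetical) pathway gene sets and a classifier is scored by cross-validated AUC. Pathifier's deregulation scores are a more sophisticated alternative to the plain mean used here, and all data below are synthetic.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
X_genes = rng.standard_normal((120, 200))          # 120 samples, 200 genes
y = rng.integers(0, 2, 120)                        # 1 = PD case, 0 = control
# Hypothetical pathway gene sets: 30 pathways of 15 genes each.
pathways = {f"pw{i}": rng.choice(200, 15, replace=False) for i in range(30)}

# Aggregate gene-level columns into one pathway-level feature per gene set.
X_pw = np.column_stack([X_genes[:, idx].mean(axis=1) for idx in pathways.values()])
auc = cross_val_score(LogisticRegression(max_iter=1000), X_pw, y,
                      cv=5, scoring="roc_auc").mean()
print("cross-validated AUC:", auc)
```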

Full Text
Peer Reviewed
Digital History and the Politics of Digitization
Zaagsma, Gerben UL

in Digital Scholarship in the Humanities (2022)

Much has been made in recent years of the transformative potential of digital resources and historical data for historical research. Historians seem to be flooded with retro-digitized and born-digital materials and tend to take these for granted, grateful for the opportunities they afford. In a research environment that increasingly privileges what is available online, the questions of why, where, and how we can access what we can access, and how it affects historical research, have become ever more urgent. This article proposes a framework through which to contextualize the politics of (digital) heritage preservation, and a model to analyze its most important political dimensions, drawing upon literature from the digital humanities and history as well as archival, library and information science. The first part outlines the global dimensions of the politics of digital cultural heritage, focusing on developments between and within the Global North and South, framed within the broader context of the politics of heritage and its preservation. The second part surveys the history and current state of digitization and offers a structured analysis of the process of digitization and its political dimensions. Choices and decisions about what to select for digitization, how to catalogue and classify it, and what metadata to add are all political in nature and have political consequences, and the same is true for access. The article concludes with several recommendations and a plea to acknowledge the importance of digital cataloguing in enabling access to the global human record.
