Full Text
Hardware-in-the-loop Proximity Operations in Cislunar Space
Muralidharan, Vivek UL; Makhdoomi, Mohatashem Reyaz UL; Barad, Kuldeep Rambhai UL et al

Scientific Conference (2022, September 20)

Space missions to Near Rectilinear Halo Orbits (NRHOs) in the Earth-Moon system are upcoming. A rendezvous technique in cislunar space is proposed in this investigation, one that leverages coupled orbit and attitude dynamics in the Circular Restricted Three-body Problem (CR3BP). An autonomous Guidance, Navigation and Control (GNC) technique is demonstrated in which a chaser spacecraft approaches a target spacecraft in the southern 9:2 synodic-resonant L2 NRHO, the orbit that currently serves as the baseline for NASA's Gateway. A two-layer control approach is adopted. First, a nonlinear optimal controller identifies an appropriate baseline rendezvous path, in both position and orientation. As the spacecraft progresses along the pre-computed baseline path, optical sensors measure the pose of the chaser relative to the target. A Kalman filter processes these observations and offers precise state estimates. A linear controller compensates for any deviations identified from the predetermined rendezvous path. The efficacy of the GNC technique is tested in a complex scenario in which the rendezvous operation is conducted with a non-cooperative tumbling target. Hardware-in-the-loop laboratory experiments are conducted as a proof of concept to validate the guidance algorithm, with observations supplemented by optical navigation techniques.

Full Text
Peer Reviewed
Digital History and the Politics of Digitization
Zaagsma, Gerben UL

in Digital Scholarship in the Humanities (2022)

Much has been made in recent years of the transformative potential of digital resources and historical data for historical research. Historians seem to be flooded with retro-digitized and born-digital materials and tend to take these for granted, grateful for the opportunities they afford. In a research environment that increasingly privileges what is available online, the questions of why, where, and how we can access what we can access, and how this affects historical research, have become ever more urgent. This article proposes a framework through which to contextualize the politics of (digital) heritage preservation, and a model to analyze its most important political dimensions, drawing upon literature from the digital humanities and history as well as archival, library, and information science. The first part outlines the global dimensions of the politics of digital cultural heritage, focusing on developments between and within the Global North and South, framed within the broader context of the politics of heritage and its preservation. The second part surveys the history and current state of digitization and offers a structured analysis of the digitization process and its political dimensions. Choices about what to select for digitization, how to catalogue and classify it, and what metadata to add are all political in nature and have political consequences; the same is true for access. The article concludes with several recommendations and a plea to acknowledge the importance of digital cataloguing in enabling access to the global human record.

Full Text
Model-based Specification and Analysis of Natural Language Requirements in the Financial Domain
Veizaga Campero, Alvaro Mario UL

Doctoral thesis (2022)

Software requirements form an important part of the software development process. In many software projects conducted by companies in the financial sector, analysts specify software requirements using a combination of models and natural language (NL). Neither models nor NL requirements alone provide a complete picture of the software system, and NL is highly prone to quality issues, such as vagueness, ambiguity, and incompleteness. Poorly written requirements are difficult to communicate and reduce the opportunity to process requirements automatically, in particular to automate tedious and error-prone tasks such as deriving acceptance criteria (AC). AC are conditions that a system must meet to be consistent with its requirements and be accepted by its stakeholders. AC are derived by developers and testers from requirement models. To obtain precise AC, it is necessary to reconcile the information content of the NL requirements and the requirement models. In collaboration with an industrial partner from the financial domain, we first systematically developed and evaluated a controlled natural language (CNL) named Rimay to help analysts write functional requirements. We then proposed an approach that detects common syntactic and semantic errors in NL requirements. Our approach suggests Rimay patterns to fix the errors and convert NL requirements into Rimay requirements. Building on these results, we propose a semi-automated approach that reconciles the content of the NL requirements with that of the requirement models. Our approach helps modelers enrich their models with information extracted from NL requirements. Finally, an existing test-specification derivation technique was applied to the enriched model to generate AC. The first contribution of this dissertation is a qualitative methodology that can be used to systematically define a CNL for specifying functional requirements.
This methodology was used to create Rimay, a CNL grammar for specifying functional requirements. The CNL was derived from an extensive qualitative analysis of a large number of industrial requirements, following a systematic process using lexical resources. An empirical evaluation of Rimay in a realistic setting, through an industrial case study, demonstrated that 88% of the requirements considered were successfully rephrased using Rimay. The second contribution of this dissertation is an automated approach that detects syntactic and semantic errors in unstructured NL requirements. We refer to these errors as smells. To this end, we first proposed a set of 10 common smells found in the NL requirements of financial applications. We then derived a set of 10 Rimay patterns as suggestions to fix the smells. Finally, we developed an automatic approach that analyzes the syntax and semantics of NL requirements to detect any smells present and then suggests a Rimay pattern to fix each smell. We evaluated our approach in an industrial case study and obtained promising results for detecting smells in NL requirements (precision of 88%) and for suggesting Rimay patterns (precision of 89%). The last contribution of this dissertation was prompted by the observation that reconciling the information content of the NL requirements and the associated models is necessary to obtain precise AC. To achieve this, we define a set of 13 information extraction rules that automatically extract AC-related information from NL requirements written in Rimay. Next, we propose a systematic method that generates recommendations for model enrichment based on the information extracted by the 13 rules. Using a real case study from the financial domain, we evaluated the usefulness of the AC-related model enrichments recommended by our approach.
The domain experts found that 89% of the recommended enrichments were relevant to AC but absent from the original model (a precision of 89%).

Decentralization as Disembodiment. Blockchain Justice between Utopia and Myopia
Becker, Katrin UL

Presentation (2022, September 15)

Epilogue: Darning the Divide - Thinking Counterfactually
Becker, Katrin UL; Lassègue, Jean

Presentation (2022, September 15)

FNR Highlight: CeMi: Cemeteries & crematoria as public spaces of belonging in Europe.
Kmec, Sonja UL

Diverse speeches and writings (2022)

Full Text
Peer Reviewed
Institutional determinants of intersectional inequalities in science
Kozlowski, Diego UL; Larivière, Vincent; Sugimoto, Cassidy R. et al

in BRIDGES BETWEEN DISCIPLINES: GENDER IN STEM AND SOCIAL SCIENCES (2022, September 12)

Peer Reviewed
Automatic Classification of Peer Review Recommendation
Kozlowski, Diego UL; Boothby, Clara; Pei-Ying, Chen et al

Poster (2022, September 08)

Full Text
A cosmopolitan international law: the authority of regional inter-governmental organisations to establish international criminal accountability mechanisms
Owiso, Owiso UL

Doctoral thesis (2022)

The overall aim of this thesis is to investigate the potential role of regional inter-governmental organisations (RIGOs) in international criminal accountability, specifically through the establishment of criminal accountability mechanisms, and to make a case for RIGOs’ active involvement. The thesis proceeds from the assumption that international criminal justice is a cosmopolitan project that demands a tenable conception of state sovereignty, one that guarantees humanity’s fundamental values, specifically human dignity. Since cosmopolitanism emphasises the equality and unity of the human family, guaranteeing the dignity and humanity of the human family is a common interest of humanity rather than a parochial endeavour. Accountability for international crimes is one way through which human dignity can be validated and reaffirmed where that dignity has been grossly and systematically assaulted. Therefore, while accountability for international crimes is primarily the obligation of individual sovereign states, this responsibility ultimately falls residually on humanity as a whole, exercisable through collective action. As such, the thesis advances the argument that states, as collective representations of humanity, have a responsibility to assist in ensuring accountability for international crimes where an individual state is either genuinely unable or unwilling to do so by itself. The thesis therefore addresses the question of whether RIGOs, as collective representations of states and their peoples, can establish international criminal accountability mechanisms. Relying on cosmopolitanism as a theoretical underpinning, the thesis examines the exercise by RIGOs of what can be considered elements of sovereign authority in pursuit of the cosmopolitan objective of accountability for international crimes.
In so doing, the thesis interrogates whether there is a basis in international law for such engagement, and examines how such engagement can practically be undertaken, using two case studies of the European Union and the Kosovo Specialist Chambers and Specialist Prosecutor’s Office, and the African Union and the (proposed) Hybrid Court for South Sudan. The thesis concludes that general international law does not preclude RIGOs from exercising elements of sovereign authority necessary for the establishment of international criminal accountability mechanisms, and that specific legal authority to engage in this regard can then be determined by reference to the doctrine of attributed/conferred powers and the doctrine of implied powers in interpreting the legal instruments of RIGOs. Based on this conclusion, the thesis makes a normative case for an active role for RIGOs in the establishment of international criminal accountability mechanisms, and provides a practical step-by-step guide on possible legal approaches for the establishment of such mechanisms by RIGOs, as well as guidance on possible design models for these mechanisms.

Full Text
Peer Reviewed
Abstracts of the 11th DACH+ Conference on Energy Informatics (S53-Taxonomy of Local Flexibility Markets)
Potenciano Menci, Sergio UL

in Energy Informatics (2022, September 07), 5

Flexibility has emerged as a potential solution to, and complement for, system operators’ current and future problems (e.g., congestion, voltage violations) caused by the integration of distributed renewable resources (e.g., wind, solar) and electric vehicles. In parallel, local flexibility markets (LFMs) appear as a possible smart grid solution to bridge flexibility-seeking and flexibility-offering customers in localized areas. Nevertheless, there is no unique, standard, or simple solution that tackles all the problems system operators and other energy actors face. Therefore, many local flexibility market concepts, initiatives (projects), and companies have developed various solutions over the last few years, and in doing so have increased the complexity of the topic. This research paper therefore aims to describe several local flexibility market concepts, initiatives (projects), and companies in Europe. To do so, we propose a taxonomy derived from LFM descriptions. We use the taxonomy-building research method proposed by [1] to develop our taxonomy, and we use the smart grid architecture model (SGAM) as a structural foundation and guideline. Given the numerous and diverse LFM solutions, we delimit the taxonomy to solutions focused on congestion management at medium and low voltage (our meta-characteristic).

Full Text
Peer Reviewed
Optimal industrial flexibility scheduling based on generic data format
Bahmani, Ramin UL; van Stiphoudt, Christine UL; Potenciano Menci, Sergio UL et al

in Energy Informatics (2022, September 07), 5

The energy transition toward a modern power system requires energy flexibility. Demand Response (DR) is one promising option for providing this flexibility. As the sector with the highest share of final energy consumption, industry has the potential to offer DR and contribute to the energy transition by adjusting its energy demand. This paper proposes a mathematical optimization model that uses a generic data model for flexibility description. The optimization model supports industrial companies in selecting when (i.e., at which time), where (i.e., in which market), and how (i.e., with which schedule) they should market their flexibility potential to optimize profit. We evaluate the optimization model on several synthetic use cases developed from learnings gathered over several workshops and bilateral discussions with industrial partners from the paper and aluminum industries. The results suggest that the model fulfills its purpose across different use cases, even complex ones with multiple loads and storage units. However, the model’s computation time grows as the complexity of the use cases grows.

Full Text
WCET and Priority Assignment Analysis of Real-Time Systems using Search and Machine Learning
Lee, Jaekwon UL

Doctoral thesis (2022)

Real-time systems have become indispensable for human life as they are used in numerous industries, such as vehicles, medical devices, and satellite systems. These systems are very sensitive to violations of their time constraints (deadlines), which can have catastrophic consequences. To verify whether the systems meet their time constraints, engineers perform schedulability analysis from early stages and throughout development. However, there are challenges in obtaining precise results from schedulability analysis due to estimating the worst-case execution times (WCETs) and assigning optimal priorities to tasks. Estimating WCET is an important activity at early design stages of real-time systems. Based on such WCET estimates, engineers make design and implementation decisions to ensure that task executions always complete before their specified deadlines. However, in practice, engineers often cannot provide a precise point of WCET estimates and they prefer to provide plausible WCET ranges. Task priority assignment is an important decision, as it determines the order of task executions and it has a substantial impact on schedulability results. It thus requires finding optimal priority assignments so that tasks not only complete their execution but also maximize the safety margins from their deadlines. Optimal priority values increase the tolerance of real-time systems to unexpected overheads in task executions so that they can still meet their deadlines. However, it is a hard problem to find optimal priority assignments because their evaluation relies on uncertain WCET values and complex engineering constraints must be accounted for. This dissertation proposes three approaches to estimate WCET and assign optimal priorities at design stages. Combining a genetic algorithm and logistic regression, we first suggest an automatic approach to infer safe WCET ranges with a probabilistic guarantee based on the worst-case scheduling scenarios. 
We then introduce an extended approach that accounts for weakly hard real-time systems using an industrial schedule simulator. We evaluate our approaches by applying them to industrial systems from different domains and to several synthetic systems. The results suggest that our approaches can estimate probabilistic safe WCET ranges efficiently and accurately, such that deadline constraints are likely to be satisfied with a high degree of confidence. Moreover, we propose an automated technique that aims to identify the best possible priority assignments in real-time systems. The approach deals with multiple objectives regarding safety margins and engineering constraints using a coevolutionary algorithm. Evaluation with synthetic and industrial systems shows that the approach significantly outperforms both a baseline approach and solutions defined by practitioners. All the solutions in this dissertation scale to complex industrial systems for offline analysis within an acceptable time, i.e., at most 27 hours.

Full Text
Multi-objective Robust Machine Learning For Critical Systems With Scarce Data
Ghamizi, Salah UL

Doctoral thesis (2022)

With the heavy reliance on Information Technologies in every aspect of our daily lives, Machine Learning (ML) models have become a cornerstone of these technologies’ rapid growth and pervasiveness, in particular the most critical and fundamental technologies that handle our economic systems, transportation, health, and even privacy. However, while these systems are becoming more effective, their complexity inherently decreases our ability to understand, test, and assess their dependability and trustworthiness. This problem becomes even more challenging under a multi-objective framework: when the ML model is required to learn multiple tasks together, behave under constrained inputs, or fulfill contradicting concomitant objectives. Our dissertation focuses on the context of robust ML under limited training data, i.e., use cases where it is costly to collect additional training data and/or label it. We study this topic through the prism of three real use cases: fraud detection, pandemic forecasting, and chest X-ray diagnosis. Each use case covers one of the challenges of robust ML with limited data: (1) robustness to imperceptible perturbations, or (2) robustness to confounding variables. We provide a study of the challenges for each case and propose novel techniques to achieve robust learning. As the first contribution of this dissertation, we collaborate with BGL BNP Paribas. We demonstrate that their overdraft and fraud detection systems are prima facie robust to adversarial attacks because of the complexity of their feature engineering and domain constraints. However, we show that gray-box attacks that take into account domain knowledge can easily break their defense. We propose CoEva2 adversarial fine-tuning, a new defense mechanism based on multi-objective evolutionary algorithms, to augment the training data and mitigate the system’s vulnerabilities. 
Next, we investigate how domain knowledge can protect against adversarial attacks through multi-task learning. We show that adding domain constraints in the form of additional tasks can significantly improve the robustness of models to adversarial attacks, particularly for the robot navigation use case. We propose a new set of adaptive attacks and demonstrate that adversarial training combined with such attacks can improve robustness. While the raw data available in the BGL and robot navigation use cases is vast, it is heavily cleaned, feature-engineered, and annotated by domain experts (which is expensive), so the resulting training data is scarce. In contrast, raw data itself is scarce when dealing with an outbreak, and designing robust ML systems to predict, forecast, and recommend mitigation policies is challenging, particularly for small countries like Luxembourg. Contrary to common techniques that forecast new cases based on previous data in time series, we propose a novel surrogate-based optimization as an integrated loop. It combines a neural network prediction of the infection rate based on mobility attributes and a model-based simulation that predicts cases and deaths. Our approach has been used by the Luxembourg government’s task force and has been recognized with a best paper award at KDD2020. Our following work focuses on the challenges that confounding factors pose to the robustness and generalization of chest X-ray (CXR) classification. We first investigate the robustness and generalization of multi-task models, then demonstrate that multi-task learning, leveraging the confounding variables, can significantly improve the generalization and robustness of CXR classification models. Our results suggest that task augmentation with additional knowledge (like extraneous variables) outperforms state-of-the-art data augmentation techniques in improving test and robust performances. 
Overall, this dissertation provides insights into the importance of domain knowledge in the robustness and generalization of models. It shows that instead of building data-hungry ML models, particularly for critical systems, a better understanding of the system as a whole and its domain constraints yields improved robustness and generalization performances. This dissertation also proposes theorems, algorithms, and frameworks to effectively assess and improve the robustness of ML systems for real-world cases and applications.
