Results 1-20 of 118.
((researchcenter:SnT) AND (datepublished:2021)) AND ((filter:inst))

Full Text
Peer Reviewed
To verify or tolerate, that’s the question
Pinto Gouveia, Ines UL; Sakr, Mouhammad UL; Graczyk, Rafal UL et al

Scientific Conference (2021, December 06)

Formal verification carries the promise of absolute correctness, guaranteed at the highest level of assurance known today. However, inherent to many verification attempts is the assumption that the underlying hardware, the code-generation toolchain and the verification tools are correct, all of the time. While this assumption creates interesting recursive verification challenges, which have already been executed successfully for all three of these elements, the coverage of this assumption remains incomplete, in particular for hardware. Accidental faults, such as single-event upsets, transistor aging and latchups, keep causing hardware to behave arbitrarily when such events occur and require other means (e.g., tolerance) to safely operate through them. Targeted attacks, especially physical ones, have a similar potential to cause havoc. Moreover, faults of the above kind may well manifest in such a way that their effects extend to all software layers, causing incorrect behavior even in proven-correct ones. In this position paper, we take a holistic, system-architectural point of view on the role of trusted-execution environments (TEEs), their implementation complexity and the guarantees they can convey, which we want preserved in the presence of faults. We find that if absolute correctness is to remain our visionary goal, TEEs can and should be constructed differently, with tolerance embedded at the lowest levels and with verification playing an essential role. Verification should both assure the correctness of the TEE construction protocols and mechanisms and help protect the applications executing inside the TEEs.

Detailed reference viewed: 66 (2 UL)
Full Text
Peer Reviewed
Self-Sovereign Identity for the Financial Sector: A Case Study of PayString Service
Scheidt de Cristo, Flaviene UL; Shbair, Wazen UL; Trestioreanu, Lucian Andrei UL et al

Scientific Conference (2021, December 06)

PayString is an initiative to make payment identifiers global and human-readable, facilitating the exchange of payment information. However, the reference implementation lacks privacy and security features, making it possible for anyone to access the payment information as long as the PayString identifier is known. This paper addresses these shortcomings and presents the first performance evaluation of PayString. Via a large-scale testbed, our experimental results show an overhead which, given the privacy and security advantages offered, is acceptable in practice, making the proposed solution feasible.
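PayString identifiers follow a `user$host` scheme that maps deterministically to an HTTPS URL, which is why knowledge of the identifier alone suffices to fetch the payment information. A minimal sketch of that mapping (the identifier and domain below are made-up examples, and the content negotiation for specific payment networks is omitted):

```python
def paystring_to_url(paystring: str) -> str:
    """Map a PayString identifier to the HTTPS URL it resolves to.

    'alice$example.com' resolves via an HTTPS GET to
    https://example.com/alice; the Accept-header negotiation for
    payment-network details is not modelled here.
    """
    user, _, host = paystring.partition("$")
    if not user or not host:
        raise ValueError(f"not a valid PayString identifier: {paystring!r}")
    return f"https://{host}/{user}"

print(paystring_to_url("alice$institution.example"))
# → https://institution.example/alice
```

Because the mapping is public and deterministic, any party knowing the identifier can issue the same request, which is precisely the exposure the paper's privacy extensions aim to close.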

Detailed reference viewed: 62 (5 UL)
Full Text
Peer Reviewed
Learning-Assisted User Clustering in Cell-Free Massive MIMO-NOMA Networks
Le, Quang Nhat; Nguyen, van Dinh UL; Dobre, Octavia A. et al

in IEEE Transactions on Vehicular Technology (2021), 70(12), 12872-12887

The superior spectral efficiency (SE) and user fairness features of non-orthogonal multiple access (NOMA) systems are achieved by exploiting user clustering (UC) more efficiently. However, a random UC certainly results in a suboptimal solution, while an exhaustive search method comes at the cost of high complexity, especially for systems of medium-to-large size. To address this problem, we develop two efficient unsupervised machine learning based UC algorithms, namely k-means++ and improved k-means++, to effectively cluster users into disjoint clusters in cell-free massive multiple-input multiple-output (CFmMIMO) systems. Adopting full-pilot zero-forcing at access points (APs) to comprehensively assess the system performance, we formulate the sum SE optimization problem taking into account power constraints at APs, necessary conditions for implementing successive interference cancellation, and required SE constraints at user equipment. The formulated optimization problem is highly non-convex, and thus, it is difficult to obtain the global optimal solution. Therefore, we develop a simple yet efficient iterative algorithm for its solution. In addition, the performance of the collocated massive MIMO-NOMA (COmMIMO-NOMA) system is also characterized. Numerical results are provided to show the superior performance of the proposed UC algorithms compared to baseline schemes. The effectiveness of applying NOMA in CFmMIMO and COmMIMO systems is also validated.
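The k-means++ seeding step underlying the proposed UC algorithms can be sketched as follows. This toy version clusters users by a single scalar gain, a hypothetical stand-in for the channel statistics the paper actually uses:

```python
import random

def kmeans_pp_seed(points, k, rng=None):
    """k-means++ seeding: pick the first centre uniformly at random,
    then sample each further centre with probability proportional to
    its squared distance from the nearest centre chosen so far."""
    rng = rng or random.Random(0)
    centers = [rng.choice(points)]
    while len(centers) < k:
        # squared distance of every point to its nearest existing centre
        d2 = [min((p - c) ** 2 for c in centers) for p in points]
        total = sum(d2)
        if total == 0:               # all points coincide with a centre
            centers.append(centers[0])
            continue
        r = rng.uniform(0, total)
        acc = 0.0
        for p, w in zip(points, d2):
            acc += w
            if w > 0 and acc >= r:   # weighted sampling by cumulative sum
                centers.append(p)
                break
    return centers

# Toy example: each user summarised by one hypothetical channel gain (dB).
gains = [-80.0, -79.5, -95.0, -94.2, -110.0, -109.5]
print(kmeans_pp_seed(gains, 3))
```

The distance-weighted seeding is what spreads the initial centroids across well-separated user groups, which is the property the paper exploits to avoid the suboptimality of random clustering.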

Detailed reference viewed: 99 (19 UL)
Full Text
Peer Reviewed
What's in a Cyber Threat Intelligence sharing platform?: A mixed-methods user experience investigation of MISP
Stojkovski, Borce UL; Lenzini, Gabriele UL; Koenig, Vincent UL et al

in Annual Computer Security Applications Conference (ACSAC ’21) (2021, December)

The ever-increasing scale and complexity of cyber attacks and cyber-criminal activities necessitate secure and effective sharing of cyber threat intelligence (CTI) among a diverse set of stakeholders and communities. CTI sharing platforms are becoming indispensable tools for cooperative and collaborative cybersecurity. Nevertheless, despite the growing research in this area, the emphasis is often placed on the technical aspects, incentives, or implications associated with CTI sharing, as opposed to investigating challenges encountered by users of such platforms. To date, user experience (UX) aspects remain largely unexplored. This paper offers a unique contribution towards understanding the constraining and enabling factors of security information sharing within one of the leading platforms. MISP is an open-source CTI sharing platform used by more than 6,000 organizations worldwide. As a technically advanced CTI sharing platform, it aims to cater to a diverse set of security information workers with distinct needs and objectives. In this respect, MISP has to pay equal attention to the UX in order to maximize and optimize the quantity and quality of threat information that is contributed and consumed. Using mixed methods, we shed light on the strengths and weaknesses of MISP from an end-user perspective and discuss the role UX could play in effective CTI sharing. We conclude with an outline of future work and open challenges worth further exploring in this nascent, yet highly important socio-technical context.

Detailed reference viewed: 252 (9 UL)
Full Text
Peer Reviewed
Massive Superpoly Recovery with Nested Monomial Predictions
Hu, Kai; Sun, Siwei; Todo, Yosuke et al

in Advances in Cryptology - ASIACRYPT 2021 - 27th International Conference on the Theory and Application of Cryptology and Information Security Singapore, December 6-10, 2021, Proceedings, Part I (2021, December)

Detailed reference viewed: 28 (0 UL)
Global land dampness characterization using reflectometry by students (GOLDCREST): mission and CubeSat design
Asadi, Niloofar; Gholami-Boroujeni, Farzaneh; Gogoi, Bishwajit et al

in Proceedings of the 12th European CubeSat symposium (2021, November 15)

Detailed reference viewed: 115 (15 UL)
Full Text
Solar-Aerodynamic Formation Flight for 5G Experiments
Thoemel, Jan UL; Querol, Jorge UL; Bokal, Zhanna UL et al

in Proceedings of the 12th European CubeSat Symposium (2021, November 15)

Detailed reference viewed: 99 (24 UL)
Full Text
Peer Reviewed
Refining Weakly-Supervised Free Space Estimation through Data Augmentation and Recursive Training
Robinet, François UL; Frank, Raphaël UL

in Proceedings of BNAIC/BeneLearn 2021 (2021, November 12)

Free space estimation is an important problem for autonomous robot navigation. Traditional camera-based approaches rely on pixel-wise ground-truth annotations to train a segmentation model. To cover the wide variety of environments and lighting conditions encountered on roads, training supervised models requires large datasets. This makes the annotation cost prohibitively high. In this work, we propose a novel approach for obtaining free space estimates from images taken with a single road-facing camera. We rely on a technique that generates weak free space labels without any supervision, which are then used as ground truth to train a segmentation model for free space estimation. We study the impact of different data augmentation techniques on the performance of free space predictions, and propose a recursive training strategy. Our results are benchmarked using the Cityscapes dataset and improve over comparable published work across all evaluation metrics. Our best model reaches 83.64% IoU (+2.3%), 91.75% Precision (+2.4%) and 91.29% Recall (+0.4%). These results correspond to 88.8% of the IoU, 94.3% of the Precision and 93.1% of the Recall obtained by an equivalent fully-supervised baseline, while using no ground-truth annotation. Our code and models are freely available online.
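The recursive training strategy (fit on the current labels, then replace them with the model's own predictions and refit) can be sketched generically. The threshold "model" below is a deliberately trivial stand-in for the segmentation network, and all data is hypothetical:

```python
def recursive_training(weak_labels, train, predict, rounds=3):
    """Generic recursive (self-)training loop: fit a model on the current
    labels, replace the labels with the model's own predictions, repeat.
    'train' and 'predict' stand in for the segmentation model."""
    labels = weak_labels
    model = None
    for _ in range(rounds):
        model = train(labels)
        labels = predict(model, labels)
    return model, labels

# Toy instantiation: "images" are 1-D intensity rows, the "model" is a
# single threshold, and the weak labels are noisy free-space masks.
def train(masks):
    flat = [v for m in masks for v in m]
    return sum(flat) / len(flat)          # threshold = mean label value

def predict(thr, masks):
    return [[1.0 if v >= thr else 0.0 for v in m] for m in masks]

weak = [[0.9, 0.8, 0.2], [0.7, 0.1, 0.0]]
model, refined = recursive_training(weak, train, predict, rounds=2)
print(refined)  # → [[1.0, 1.0, 0.0], [1.0, 0.0, 0.0]]
```

Each round sharpens the noisy weak labels toward the model's own consistent decisions, which is the effect the paper combines with data augmentation.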

Detailed reference viewed: 114 (29 UL)
Full Text
At the Edge of a Seamless Cloud Experience
Rac, Samuel UL; Brorsson, Mats Hakan UL

E-print/Working paper (2021)

There is a growing need for low latency for many devices and users. The traditional cloud computing paradigm cannot meet this requirement, legitimizing the need for a new paradigm. Edge computing proposes to move computing capacities to the edge of the network, closer to where data is produced and consumed. However, edge computing raises new challenges. At the edge, devices are more heterogeneous than in the data centre, where everything is optimized to achieve economies of scale. Edge devices can be mobile, like a car, which complicates the architecture with dynamic topologies. IoT devices produce a considerable amount of data that can be processed at the edge. In this paper, we discuss the main challenges to be met in edge computing and solutions to achieve a seamless cloud experience. We propose to use technologies like containers and WebAssembly to manage the execution of applications on heterogeneous devices.

Detailed reference viewed: 18 (10 UL)
Full Text
Peer Reviewed
Leveraging High-Frequency Components for Deepfake Detection
Mejri, Nesryne UL; Papadopoulos, Konstantinos UL; Aouada, Djamila UL

in IEEE Workshop on Multimedia Signal Processing (2021)

In the past years, RGB-based deepfake detection has shown notable progress thanks to the development of effective deep neural networks. However, the performance of deepfake detectors remains primarily dependent on the quality of the forged content and the level of artifacts introduced by the forgery method. To detect these artifacts, it is often necessary to separate and analyze the frequency components of an image. In this context, we propose to utilize the high-frequency components of color images by introducing an end-to-end trainable module that (a) extracts features from high-frequency components and (b) fuses them with the features of the RGB input. The module not only exploits the high-frequency anomalies present in manipulated images but can also be used with most RGB-based deepfake detectors. Experimental results show that the proposed approach boosts the performance of state-of-the-art networks, such as XceptionNet and EfficientNet, on a challenging deepfake dataset.
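Separating a high-frequency component is the core preprocessing idea. A crude illustration using a 3x3 local-mean residual (the paper's module learns this extraction end-to-end rather than applying a fixed filter like this one):

```python
def high_freq_residual(img):
    """Crude high-pass filter: subtract the 3x3 local mean from each
    pixel, leaving edges and texture, where forgery artifacts tend to
    concentrate. 'img' is a 2-D list of grayscale intensities."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [img[y2][x2]
                    for y2 in range(max(0, y - 1), min(h, y + 2))
                    for x2 in range(max(0, x - 1), min(w, x + 2))]
            out[y][x] = img[y][x] - sum(vals) / len(vals)
    return out

row = [[0.0, 0.0, 9.0, 9.0]]
print(high_freq_residual(row))  # → [[0.0, -3.0, 3.0, 0.0]]
```

Flat regions map to zero while the step edge survives, which is why such residuals expose blending boundaries that are invisible in the raw RGB statistics.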

Detailed reference viewed: 235 (69 UL)
Automatic Analysis, Representation and Reconstruction of Textured 3D Human Scans
Saint, Alexandre Fabian A UL

Doctoral thesis (2021)

Various practical applications in computer vision are related to the human body. These involve representing and modelling the body shape, pose, clothing and appearance with mathematical and statistical tools, requiring datasets of examples representative of the variation in the data. Three-dimensional (3D) data is especially important as it allows the physical world to be simulated directly, for example to analyse and lift ambiguities in other prevalent data modalities, such as images. However, existing datasets of 3D human scans show limitations in their size, diversity, quality or annotation. This reduces their applicability in tackling research questions around the 3D human body. Two particular applications of interest that remain open are the estimation of body shape under clothing, and the completion of textured shape for missing or defective data. This thesis proposes three main contributions. First, 3DBodyTex, a dataset of 3D human scans, which complements alternative datasets with real scans, body and clothing scans, hundreds of subjects, high-resolution texture information, dense annotations and aligned body shapes under the clothing. The aim is to enable and facilitate new research possibilities with learning-based methods, in 3D or using derived modalities. Second, to build this dataset automatically from raw scans, multiple robust 3D processing methods are proposed. These involve pose estimation, pose fitting, tight body shape fitting, and body shape estimation under clothing. The proposed methods show competitive or improved results on existing benchmarks and on new benchmarks based on 3DBodyTex. In particular, an alternative method is proposed to estimate the body shape under clothing from a single scan. On independent benchmarks, it is competitive with, or better than, methods requiring a full time sequence of scans. Third, the task of shape and texture completion of 3D human scans is tackled. A new method is proposed that completes the shape and the texture sequentially, and automatically identifies the missing regions. In particular, partial convolutions are extended to texture images (UV maps) for inpainting the colour of a 3D scan using a convolutional neural network. A new benchmark, based on 3DBodyTex, is proposed for the evaluation.

Detailed reference viewed: 76 (7 UL)
Full Text
Peer Reviewed
Digital Identities and Verifiable Credentials
Sedlmeir, Johannes UL; Smethurst, Reilly UL; Rieger, Alexander UL et al

in Business and Information Systems Engineering (2021), 63(5), 603-613

Public institutions and companies typically employ physical credentials (such as passports, social security cards, and employee badges) to identify individuals. Individuals can choose where to store their physical credentials, and sometimes, they can decide to whom their credentials are disclosed. These familiar privileges inspired a new type of digital credential called a verifiable credential (VC). Similar to physical credentials, individuals can store their verifiable credentials in a so-called digital wallet on their mobile phone, on another edge device, or in the cloud, and they can use verifiable credentials for identification, authentication, and authorization.
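A verifiable credential is typically exchanged as a JSON document following the W3C VC data model. An abridged, hypothetical example (the DIDs, credential type and signature value below are placeholders, not real identifiers or keys):

```python
import json

# Abridged verifiable credential following the W3C VC data model.
vc = {
    "@context": ["https://www.w3.org/2018/credentials/v1"],
    "type": ["VerifiableCredential", "EmployeeBadgeCredential"],
    "issuer": "did:example:employer",
    "issuanceDate": "2021-12-01T00:00:00Z",
    "credentialSubject": {
        "id": "did:example:employee-123",   # the holder's identifier
        "role": "analyst",                   # the attested claim
    },
    "proof": {
        "type": "Ed25519Signature2018",
        "jws": "<issuer-signature-over-the-credential>",  # placeholder
    },
}
print(json.dumps(vc, indent=2))
```

The issuer signs the claims in `credentialSubject`; the holder stores the document in a wallet and presents it to verifiers, who check the proof without contacting the issuer.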

Detailed reference viewed: 77 (6 UL)
Full Text
Peer Reviewed
SPON: Enabling Resilient Inter-Ledgers Payments with an Intrusion-Tolerant Overlay
Trestioreanu, Lucian Andrei UL; Nita-Rotaru, Cristina; Malhotra, Aanchal et al

Scientific Conference (2021, October 04)

Payment systems are a critical component of everyday life in our society. While in many situations payments are still slow, opaque, siloed and expensive, or even fail, users expect them to be fast, transparent, cheap, reliable and global. Recent technologies such as distributed ledgers create opportunities for near-real-time, cheaper and more transparent payments. However, in order to achieve a global payment system, payments should be possible not only within one ledger, but also across different ledgers and geographies. In this paper we propose Secure Payments with Overlay Networks (SPON), a service that enables global payments across multiple ledgers by combining the transaction exchange provided by the Interledger protocol with an intrusion-tolerant overlay of relay nodes to achieve (1) improved payment latency, (2) fault tolerance to benign failures such as node failures and network partitions, and (3) resilience to BGP hijacking attacks. We discuss the design goals and present an implementation based on the Interledger protocol and the Spines overlay network. We analyze the resilience of SPON and demonstrate through experimental evaluation that it is able to improve payment latency, recover from path outages, withstand network partition attacks, and disseminate payments fairly across multiple ledgers. We also show how SPON can be deployed to make the communication between different ledgers resilient to BGP hijacking attacks.

Detailed reference viewed: 58 (4 UL)
Full Text
Peer Reviewed
NB-IoT Random Access for Non-Terrestrial Networks: Preamble Detection and Uplink Synchronization
Chougrani, Houcine UL; Kisseleff, Steven UL; Alves Martins, Wallace UL et al

in IEEE Internet of Things Journal (2021)

The satellite component is recognized as a promising solution to complement and extend the coverage of future Internet of Things (IoT) terrestrial networks (TNs). In this context, a study item to integrate satellites into narrowband IoT (NB-IoT) systems has been approved within the 3rd Generation Partnership Project (3GPP) standardization body. However, as NB-IoT systems were initially conceived for TNs, their basic design principles and operation might require some key modifications when incorporating the satellite component. These changes in NB-IoT systems therefore need to be carefully implemented in order to guarantee a seamless integration of both TN and non-terrestrial network (NTN) for global coverage. This paper addresses this adaptation for the random access (RA) step in NB-IoT systems, which is in fact the most challenging aspect in the NTN context, for it deals with multi-user time-frequency synchronization and timing advance for data scheduling. In particular, we propose an RA technique which is robust to typical satellite channel impairments, including long delays, significant Doppler effects, and wide beams, without requiring any modification to the current NB-IoT RA waveform. Performance evaluations demonstrate the proposal's capability of addressing different NTN configurations recently defined by 3GPP for the 5G new radio system.
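The timing-advance problem the paper addresses stems from round-trip delays far larger than in terrestrial cells. A back-of-the-envelope computation (the slant ranges are illustrative values for a low-Earth-orbit satellite, not figures from the paper):

```python
C = 299_792_458.0  # speed of light, m/s

def round_trip_delay_ms(slant_range_km: float) -> float:
    """Round-trip propagation delay a terminal must pre-compensate
    (the essence of timing advance) for a given slant range."""
    return 2 * slant_range_km * 1e3 / C * 1e3

# A hypothetical 600 km LEO satellite seen near zenith vs. at a longer
# low-elevation slant range:
print(round(round_trip_delay_ms(600), 2))   # ~4 ms
print(round(round_trip_delay_ms(1932), 2))  # ~12.9 ms
```

Delays of several milliseconds, and their spread across a wide beam, dwarf the guard times assumed by terrestrial NB-IoT random access, which is why preamble detection and uplink synchronization must be redesigned for NTN.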

Detailed reference viewed: 110 (13 UL)
Full Text
Peer Reviewed
A Theoretical Framework for Understanding the Relationship Between Log Parsing and Anomaly Detection
Shin, Donghwan UL; Khan, Zanis Ali UL; Bianculli, Domenico UL et al

in Proceedings of the 21st International Conference on Runtime Verification (2021, October)

Log-based anomaly detection identifies systems' anomalous behaviors by analyzing system runtime information recorded in logs. While many approaches have been proposed, all of them have in common an essential pre-processing step called log parsing. This step is needed because automated log analysis requires structured input logs, whereas original logs contain semi-structured text printed by logging statements. Log parsing bridges this gap by converting the original logs into structured input logs fit for anomaly detection. Despite the intrinsic dependency between log parsing and anomaly detection, no existing work has investigated the impact of the "quality" of log parsing results on anomaly detection. In particular, the concept of "ideal" log parsing results with respect to anomaly detection has not been formalized yet. This makes it difficult to determine, upon obtaining inaccurate results from anomaly detection, if (and why) the root cause for such results lies in the log parsing step. In this short paper, we lay the theoretical foundations for defining the concept of "ideal" log parsing results for anomaly detection. Based on these foundations, we discuss practical implications regarding the identification and localization of root causes, when dealing with inaccurate anomaly detection, and the identification of irrelevant log messages.
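Log parsing in the sense used here reduces each raw message to a template by masking its variable fields. A minimal regex-based sketch (real parsers such as Drain are considerably more sophisticated, and the log line is a made-up example):

```python
import re

# Order matters: mask IPs before bare numbers so the octets survive
# as a single <IP> token.
MASKS = [
    (re.compile(r"\b\d+\.\d+\.\d+\.\d+\b"), "<IP>"),
    (re.compile(r"\b0x[0-9a-fA-F]+\b"), "<HEX>"),
    (re.compile(r"\b\d+\b"), "<NUM>"),
]

def parse(line: str) -> str:
    """Reduce a raw log message to its template by masking variable
    fields -- the structured form that anomaly detectors consume."""
    for pattern, token in MASKS:
        line = pattern.sub(token, line)
    return line

print(parse("Received block blk_3587 of size 67108864 from 10.251.42.84"))
# → Received block blk_3587 of size <NUM> from <IP>
```

The "quality" question the paper formalizes is precisely whether such masking collapses the messages that matter for anomaly detection into the right templates, or erases distinctions the detector needed.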

Detailed reference viewed: 165 (29 UL)
Full Text
Peer Reviewed
Specifying Properties over Inter-Procedural, Source Code Level Behaviour of Programs
Dawes, Joshua UL; Bianculli, Domenico UL

in Proceedings of the 21st International Conference on Runtime Verification (2021, October)

The problem of verifying a program at runtime with respect to some formal specification has led to the development of a rich collection of specification languages. These languages often have a high level of abstraction and provide sophisticated modal operators, giving a high level of expressiveness. In particular, this makes it possible to express properties concerning the source code level behaviour of programs. However, for many languages, the correspondence between events generated at the source code level and parts of the specification in question would have to be carefully defined. To enable expressing properties over source code level behaviour using a temporal logic, without the need for this correspondence, previous work introduced Control-Flow Temporal Logic (CFTL), a specification language with a low level of abstraction with respect to the source code of programs. However, that work focused solely on the intra-procedural setting. In this paper, we address this limitation by introducing Inter-procedural CFTL, a language for expressing source code level, inter-procedural properties of program runs. We evaluate the new language, iCFTL, via application to a real-world case study.
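Schematically, checking a source-code-level property amounts to evaluating a predicate at every trace event that matches a trigger. The sketch below illustrates this idea only; it is not CFTL's actual syntax, and the trace events are hypothetical:

```python
def check_whenever(trace, trigger, predicate):
    """Check a source-level property over a recorded run: at every
    trace event satisfying `trigger`, `predicate` must hold on the
    program state. Returns the list of violating events."""
    return [e for e in trace if trigger(e) and not predicate(e)]

# Hypothetical trace: (line number, event kind, variable bindings).
trace = [
    (10, "assign", {"q": 3}),
    (12, "call:f", {"q": 3}),
    (15, "assign", {"q": 0}),
    (17, "call:f", {"q": 0}),   # violates "q > 0 at every call to f"
]
violations = check_whenever(
    trace,
    trigger=lambda e: e[1] == "call:f",
    predicate=lambda e: e[2]["q"] > 0,
)
print(violations)  # → [(17, 'call:f', {'q': 0})]
```

The inter-procedural extension the paper introduces is, in these terms, about triggers and predicates that relate events across different function calls rather than within a single one.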

Detailed reference viewed: 295 (63 UL)
Full Text
DATA DISTRIBUTION API SPECIFICATION
Blanco, Braulio UL; Brorsson, Mats Hakan UL

Report (2021)

The second deliverable for the Script Project: API Specification

Detailed reference viewed: 60 (15 UL)
Full Text
Predictive Assistance for Security Risk Assessment
Bettaieb, Seifeddine UL

Doctoral thesis (2021)

In many domains, such as healthcare and banking and most notably the Fintech industry, IT systems can be exposed to breaches or attacks and need to fulfill various security requirements to prevent such scenarios from happening while limiting any potential exposure. In order to demonstrate or establish compliance, risk assessments are conducted to determine potential threats and vulnerabilities that a system might be exposed to, as well as potential security controls to implement in order to counter those breaches and fulfill the requirements. An important difficulty that analysts have to contend with during that process is sifting through a large number of vulnerabilities and security controls and determining which ones have a bearing on a given system. This challenge is often exacerbated by the scarce security expertise available in most organizations. In addition, in the traditional approach risk assessments are conducted manually and rely heavily on the expertise of the available risk assessors. This turns manually eliciting the applicable vulnerabilities and controls into a lengthy, costly, tedious, and error-prone activity. Our goal is to develop an automated approach that provides decision support during that process by allowing the system to assist in the identification of vulnerabilities and security controls that are relevant to a particular context. Our approach, which is based on Machine Learning (ML), leverages historical data from security assessments performed over past systems in order to recommend applicable vulnerabilities and controls for a new system. Natural Language Processing (NLP) techniques are used in combination with ML to extract any useful information from those previous records, e.g., data from a project's internal and external environment, including its scope, involved assets, collaborators, etc. We operationalize and empirically evaluate our approach using real historical data from the banking domain. The automation of such a process raises several challenges: understanding the specifics of risk assessments is the first, and using the right tools to obtain the desired results is a second. In fact, in addition to requiring the right data and features in combination with the proper ML techniques, existing NLP techniques are not built to handle the textual data in risk assessments, with its technicalities and multilingualism. An additional challenge is to find a suitable knowledge representation for risk assessments that would enable the automation of decision support while maintaining both cohesiveness and understandability for all involved stakeholders. In this dissertation, we investigate to which extent one can automatically provide recommendations during a risk assessment. We focus exclusively on vulnerabilities and security controls. All our technical solutions have been developed and empirically evaluated in close collaboration with our industrial partner.
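The kind of recommendation described above can be illustrated with a deliberately simplified nearest-neighbour sketch over bag-of-words project descriptions. All data is hypothetical, and the thesis itself uses richer NLP features and real assessment records:

```python
import math
from collections import Counter

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def recommend(new_desc, history):
    """Recommend the security controls recorded for the most similar
    past assessment, comparing project descriptions as bags of words."""
    q = Counter(new_desc.lower().split())
    best = max(history,
               key=lambda h: cosine(q, Counter(h["desc"].lower().split())))
    return best["controls"]

history = [
    {"desc": "online banking portal external users", "controls": ["MFA", "TLS"]},
    {"desc": "internal batch reporting system", "controls": ["access logging"]},
]
print(recommend("new external banking mobile portal", history))
# → ['MFA', 'TLS']
```

A single nearest neighbour is of course far weaker than a learned model, but it makes the core idea concrete: past assessments of similar systems carry transferable evidence about which controls apply.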

Detailed reference viewed: 108 (14 UL)
Full Text
Development of a Decision Support System for Incorporating Risk Assessments during the System Design of Microsatellites
Pandi Perumal, Raja UL

Doctoral thesis (2021)

The primary purpose of this research is to develop a decision support system for the early design of an optimal and reliable satellite while making the overall conceptual design process more efficient. Generally, a satellite design process begins with a mission definition followed by the functional design of the satellite system. Beyond this, the design goes through several iterations and eventually results in a detailed satellite system design. Only then does it make sense to feed in the piece-part information to estimate the reliability of the entire satellite system. Predicted reliability from this bottom-up method may sometimes be markedly lower than the requirements. In such a case, the maturity of the design is brought down, and mitigation strategies need to be implemented to meet the reliability requirements. Consequently, introducing new or redundant parts as a mitigation approach can violate previously satisfied requirements such as mass, power and cost. Furthermore, additional design iterations are needed until all the requirements are met. This design approach is therefore expensive and inefficient, and it can be avoided if reliability is considered from the early design phase. However, the challenge is to perform reliability analysis and system design simultaneously, as they are entirely different engineering disciplines. In this research, a decision support system, DESIRA, is developed to bridge the gap between these two engineering disciplines and incorporate reliability assessments during the early design phase, thus resulting in a truly optimal satellite design. With its unique features, such as Reliability Allocation, Reliability Growth, Multidisciplinary Design Optimization and Reliability-Based Multidisciplinary Design Optimization, DESIRA effectively aids the system design at each maturity level.

Detailed reference viewed: 122 (13 UL)