ORBi

- H2 Norm Based Network Volatility Measures (Goncalves, Jorge et al.), in Proceedings of the American Control Conference (2014). Motivated by applications in biology and economics, we propose new volatility measures based on the H2 system norm for linear networks stimulated by independent or correlated noise. We identify critical links in a network, where relatively small improvements can lead to large reductions in network volatility measures. We also examine volatility measures of individual nodes and their dependence on the topological position in the network. Finally, we investigate the dependence of the volatility on different network interconnections, weights of the edges and other network properties. Hence, in an intuitive and efficient way, we can identify critical links, nodes and interconnections in a network, which can shed light on network design to make it more robust.
- Heterogeneous agent models in economics: a study of heterogeneous productivity of sectors (Goncalves, Jorge et al.), in Proceedings of the 2008 American Control Conference (2008). Macroeconomic modeling is undergoing a change from the ground up. Previously, models based on fully rational representative agents were constructed to give macroeconomics solid microeconomic foundations. However, the representative agent models have been shown to be inconsistent with empirical evidence, and a new approach has emerged, one based on heterogeneity of agents.
Recently, heterogeneous models have been used to simulate expected outcomes, but due to their complexity little analytic work has been done. In this paper a basic model of the macroeconomy, with heterogeneous sectors differentiated by productivity and driven by a jump Markov process, is investigated, and steady-state solutions for a sector's output variance are discovered. We adjust the model to include a gain term representing a sector's reaction to its error signal (excess demand), and then linearize the transition rates and apply the fluctuation-dissipation theorem to solve the model.
- High Performance Parallel Coupling of OpenFOAM+XDEM (Besseron, Xavier; Rousset, Alban et al.), Presentation (2019, June 21).
- Higher-order quasicontinuum methods for elastic and dissipative lattice models: uniaxial deformation and pure bending (Beex, Lars et al.), in GAMM Mitteilungen (2015), 38(2), 344-368. The quasicontinuum (QC) method is a numerical strategy to reduce the computational cost of direct lattice computations; in this study we achieve a speed-up of a factor of 40. It has successfully been applied to (conservative) atomistic lattices in the past, but using a virtual-power statement it was recently shown that QC approaches can also be used for spring and beam lattice models that include dissipation. Recent results have shown that QC approaches for planar beam lattices experiencing in-plane and out-of-plane deformation require higher-order interpolation; nevertheless, higher-order QC frameworks are scarce. In this contribution, the possibilities of a second-order and a third-order QC framework are investigated for an elastoplastic spring lattice.
The higher-order QC frameworks are compared to the results of the direct lattice computations and to those of a linear QC scheme. Examples are chosen so that both a macroscale and a microscale quantity influence the results. The two multiscale examples focused on are (i) macroscopically prescribed uniaxial deformation and (ii) macroscopically prescribed pure bending. Furthermore, the examples include an individual inclusion in a large lattice and are hence concurrent in nature.
- A holding control strategy for diverging bus lines (Laskaris, Georgios et al.), Scientific Conference (2018, July 24). We introduce a holding criterion for network configurations with lines that operate jointly along a common corridor and then individually diverge. The proposed holding decision rule accounts for all the different passenger groups in the overlapping segment and takes care of the transition to individual line operation. The holding rule is evaluated using simulation for different demand levels and segmentations, and compared with other control schemes for a real-world network. Results show that gains in overall network performance, as well as for specific passenger groups, can be achieved under specific demand distributions.
- How companies learn from design flaws: results from an empirical study of the German manufacturing industry (Gericke, Kilian; Blessing, Lucienne), in Proceedings of the 15th International Conference on Engineering Design (2005).
Design flaws often become apparent at a time when the product is already in use and its development process, which in many cases includes extensive testing of parts, components and prototypes, is considered complete. Such flaws may range from poor ergonomics to the total failure of the product. Often, especially when user safety is at risk, design flaws are so severe that companies are forced to announce a product recall. Petroski suggests that many (if not most) products which we are familiar with today have a long history of previously flawed designs [3]. This implies that designers did indeed learn from design flaws in both senses of the word "learn": discovering the flaw and utilizing the knowledge gained about it to find a solution. As far as discovering a design flaw is concerned, it can be assumed that the feedback from those who interact with the physical products in practice – the individuals who maintain, repair, recycle but essentially use the products – plays an important role. In their previous work, the authors pointed out that this feedback information is not only vital for identifying potential product hazards but also helps designers to review the effects of their design measures and therefore to improve their products from generation to generation [4]. In order to obtain a better understanding of how designers learn from design flaws, a mail survey was conducted that aimed at investigating company-, process- and product-related factors of this phenomenon and at answering (among others) the following research questions:
  • To what extent are design flaws of a company's (or a competitor's) product a driving force in the development of new products?
  • How do the designers of a company become aware of design flaws of their products?
  • How successful are companies in correcting design flaws?
  • How do successful and unsuccessful companies differ in terms of size, activity profile of their designers and characteristics of their products?
  • What are possible factors that influence the success in correcting a design fault?
- HuMiX: a microfluidics-based in vitro co-culture device for investigating human-microbial molecular interactions (Shah, Pranjul; Wilmes, Paul), Scientific Conference (2012, August 25).
- A hybrid T-Trefftz polygonal finite element for linear elasticity (Bordas, Stéphane et al.), E-print/Working paper (2014). In this paper, we construct hybrid T-Trefftz polygonal finite elements. The displacement field within the polygon is represented by the homogeneous solution to the governing differential equation, also called the T-complete set. On the boundary of the polygon, a conforming displacement field is independently defined to enforce continuity of the displacements across the element boundary. An optimal number of T-complete functions is chosen based on the number of nodes of the polygon and the degrees of freedom per node. The stiffness matrix is computed by the hybrid formulation with an auxiliary displacement frame. Results from the numerical studies presented for a few benchmark problems in the context of linear elasticity show that the proposed method yields highly accurate results.
- A hyper-reduction method using adaptivity to cut the assembly costs of reduced order models (Hale, Jack; Baroli, Davide et al.), E-print/Working paper (2019).
At every iteration or timestep of the online phase of some reduced-order modelling schemes, large linear systems must be assembled and then projected onto a reduced-order basis of small dimension. The projected small linear systems are cheap to solve, but assembly and projection are now the dominant computational cost. In this paper we introduce a new hyper-reduction strategy called reduced assembly (RA) that drastically cuts these costs. RA consists of a triangulation adaptation algorithm that uses a local error indicator to construct a reduced assembly triangulation specially suited to the reduced-order basis. Crucially, this reduced assembly triangulation has fewer cells than the original one, resulting in lower assembly and projection costs. We demonstrate the efficacy of RA on a Galerkin-POD type reduced-order model (RAPOD). We show performance increases of up to five times over the baseline Galerkin-POD method on a non-linear reaction-diffusion problem solved with a semi-implicit time-stepping scheme, and up to seven times for a 3D hyperelasticity problem solved with a continuation Newton-Raphson algorithm. The examples are implemented in the DOLFIN finite element solver using PETSc and SLEPc for linear algebra. Full code and data files to produce the results in this paper are provided as supplementary material.
- Identification of material thermal stress-strain behaviour for simulation of conditions in metallurgical reactors (Maciera Rodrigues, David), Doctoral thesis (2011).
- Identifying Attribute Importance in Early Product Development.
Exemplified by Interactive Technologies and Age (Pohlmeyer, Anna Elisabeth), Doctoral thesis (2011).
- Identifying elastoplastic parameters with Bayes' theorem considering double error sources and model uncertainty (Rappel, Hussein; Beex, Lars et al.), in Probabilistic Engineering Mechanics (2019), 55. We discuss Bayesian inference for the identification of elastoplastic material parameters. In addition to errors in the stress measurements, which are commonly considered, we furthermore consider errors in the strain measurements. Since a difference between the model and the experimental data may still be present if the data is not contaminated by noise, we also incorporate the possible error of the model itself. The three formulations used to describe model uncertainty in this contribution are: (1) a random variable taken from a normal distribution with constant parameters, (2) a random variable taken from a normal distribution with an input-dependent mean, and (3) a Gaussian random process with a stationary covariance function. Our results show that incorporating model uncertainty often, but not always, improves the results. If the error in the strain is considered as well, the results improve even more.
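A minimal sketch of the kind of Bayesian identification described in the entry above, restricted to linear elasticity and formulation (1) (a constant-variance model-error term). All numbers, the prior and the model are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic experiment: linear elasticity sigma = E * eps,
# with Gaussian noise on the measured stress (units and values invented).
E_true, noise_std = 210e3, 5.0          # MPa, MPa
eps = np.linspace(1e-4, 1e-3, 20)
sigma = E_true * eps + rng.normal(0.0, noise_std, eps.size)

def log_posterior(E):
    # Gaussian prior on E; Gaussian likelihood whose variance adds a
    # constant model-error term, standing in for formulation (1).
    model_var = 2.0**2
    prior = -0.5 * ((E - 200e3) / 50e3) ** 2
    resid = sigma - E * eps
    return prior - 0.5 * np.sum(resid**2) / (noise_std**2 + model_var)

# Random-walk Metropolis sampler for the posterior of E.
E, samples = 200e3, []
lp = log_posterior(E)
for _ in range(20000):
    prop = E + rng.normal(0.0, 2e3)
    lp_prop = log_posterior(prop)
    if np.log(rng.random()) < lp_prop - lp:
        E, lp = prop, lp_prop
    samples.append(E)
posterior_mean = np.mean(samples[5000:])  # discard burn-in
print(posterior_mean)
```

The paper additionally treats errors in the strain measurements and input-dependent model uncertainty, which this sketch omits.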
- Identifying fibre material parameter distributions with little experimental efforts (Rappel, Hussein; Beex, Lars; Bordas, Stéphane), Scientific Conference (2018, July 23).
- Identifying material parameter distributions of fibers with extremely limited experimental efforts (Rappel, Hussein; Beex, Lars; Bordas, Stéphane), Scientific Conference (2018, July 22).
- Image to analysis pipeline: single and double balloons kyphoplasty (Baroli, Davide; Hauseux, Paul; Hale, Jack et al.), Poster (2016, December 12). In this work, we present a semi-automatic pipeline from image to simulation of a patient's fractured vertebra after kyphoplastic augmentation with two balloons. In this procedure, the CT-scan medical images are pre-processed using the open-source software Slice3D for segmentation and 3D reconstruction. Then, using geometric processing, the 3D surface geometry is enhanced to avoid degenerate elements and triggering phenomena on the vertebra and cement areas. We perform a finite element analysis to evaluate the risk of subsequent vertebral fracture. Finally, using a Monte Carlo technique, we assess how the uncertainty in the material parameters propagates into the evaluation of this risk. Based on the developed semi-automatic pipeline, it is possible to perform a patient-specific simulation that assesses the success of the kyphoplasty operation.
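The Monte Carlo step of the pipeline above can be sketched as follows. The scalar response function, the lognormal spread on the modulus and the threshold are all invented stand-ins; in the actual pipeline each sample would drive the finite element analysis:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-in for the FE analysis: a scalar response
# (peak vertebral stress, MPa) as a function of Young's modulus E (MPa).
def peak_stress(E):
    return 4.0e5 / E + 2.0

# Propagate a lognormal uncertainty on E through the model.
E_samples = rng.lognormal(mean=np.log(8000.0), sigma=0.2, size=50000)
stress = peak_stress(E_samples)

threshold = 60.0  # illustrative fracture threshold, MPa
risk = np.mean(stress > threshold)  # estimated exceedance probability
print(stress.mean(), risk)
```

The sample mean and exceedance probability converge at the usual 1/sqrt(N) Monte Carlo rate, which is why the pipeline can afford many cheap model evaluations per patient.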
- The Impact of Route Choice Modeling on Dynamic OD Estimation, in Proceedings of the IEEE-ITS Conference (2015, September).
- Implementation and validation of an event-based real-time nonlinear model predictive control framework with ROS interface for single and multi-robot systems (Dentler, Jan Eric; Kannan, Somasundar; Olivares Mendez, Miguel Angel et al.), in 2017 IEEE Conference on Control Technology and Applications (CCTA) (2017, August 30). This paper presents the implementation and experimental validation of a central control framework. The presented framework addresses the need for a controller that provides high performance combined with a low computational load while being adaptable online to changes in the control scenario. Examples of such scenarios are cooperative control, task-based control and fault-tolerant control, where the system's topology, dynamics, objectives and constraints change. The framework combines a fast Nonlinear Model Predictive Control (NMPC), a communication interface with the Robot Operating System (ROS) [1], and a modularization that allows an event-based change of the NMPC scenario. To experimentally validate the performance and event-based adaptability of the framework, a cooperative control scenario of Unmanned Aerial Vehicles (UAVs) is used. The source code of the proposed framework is available under [2].
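The receding-horizon idea at the core of any MPC framework like the one above can be illustrated on a toy linear plant. This is a linear, unconstrained sketch only (a double integrator with invented weights); the paper's framework handles nonlinear models, constraints, multi-robot topologies and ROS integration, none of which is reproduced here:

```python
import numpy as np

# Double integrator discretized at dt; state x = [position, velocity].
dt = 0.1
A = np.array([[1.0, dt], [0.0, 1.0]])
B = np.array([[0.0], [dt]])
N = 20  # prediction horizon

def mpc_step(x):
    # Stack the horizon dynamics x_k = A^k x0 + sum_j A^(k-1-j) B u_j and
    # minimise ||states||^2 + rho * ||u||^2 by solving the normal equations.
    rho = 0.1
    Phi = np.vstack([np.linalg.matrix_power(A, k) for k in range(1, N + 1)])
    G = np.zeros((2 * N, N))
    for k in range(1, N + 1):
        for j in range(k):
            G[2 * (k - 1):2 * k, j] = (np.linalg.matrix_power(A, k - 1 - j) @ B).ravel()
    H = G.T @ G + rho * np.eye(N)
    u = np.linalg.solve(H, -G.T @ (Phi @ x))
    return u[0]  # apply only the first input (receding horizon)

x = np.array([1.0, 0.0])  # start 1 m from the origin, at rest
for _ in range(100):
    x = A @ x + B.ravel() * mpc_step(x)
print(np.linalg.norm(x))
```

Re-solving the horizon problem at every step is what lets an event-based framework swap objectives or constraints online: only the data of the optimization changes between steps.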
- Implementation of an XFEM toolbox in Diffpack, in International Conference on Extended Finite Element Methods - XFEM 2013, September 11-13, 2013, Lyon, France (2013). The Diffpack Development Framework is an object-oriented software environment for the numerical solution of partial differential equations (PDEs). By its design, Diffpack intends to close the gap between black-box simulation packages and technical computing environments using interpreted computer languages. The framework provides a high degree of modeling flexibility, while still offering the computational efficiency needed for the most demanding simulation problems in science and engineering. Technically speaking, Diffpack is a collection of C++ libraries with classes, functions and utility programs. The numerical functionality is embedded in an environment of software engineering tools supporting the management of Diffpack development projects. Diffpack supports a variety of numerical methods with a distinct focus on the finite element method (FEM), but has no inherent restrictions on the types of PDEs and therefore the applications to be solved. The key point of partition-of-unity enriched methods such as XFEM and GFEM is to help capture discontinuities, singularities or large gradients in solutions, which are not well resolved by h- or p-refinement [1]. The general idea is that the mesh need not conform to the moving boundaries, so that minimal or no remeshing is required during the analysis. Our main motivation is to provide a generic implementation of enrichment within a flexible C++ environment, namely the Diffpack platform. The work was inspired by some of our earlier work [6,9] and that of other colleagues [5,7,8].
We demonstrate how object-oriented programming is particularly useful for the treatment of data structures and operations associated with XFEM: mesh-geometry interaction, non-standard integration rules, application of boundary conditions, and treatment of level set data [2,6]. We detail the implementation of such features and verify and validate their implementation based on [5]. We show results based on unshifted and shifted enrichment [1], and study the behaviour of the stable generalized finite element method (SGFEM), which avoids blending effects and helps control the conditioning of the system matrix [4]. For the integration of elements cut by an interface, we use an in-house Delaunay triangulation algorithm proposed by [3,5] and presented in detail in a companion paper.
- Implementation of regularized isogeometric boundary element methods for gradient-based shape optimization in two-dimensional linear elasticity (Bordas, Stéphane et al.), in International Journal for Numerical Methods in Engineering (2015).
- An Implicit boundary approach for viscous compressible high Reynolds flows using hybrid remeshed particle hydrodynamics method (Obeidat, Anas; Bordas, Stéphane), in Journal of Computational Physics (in press).
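For reference, the H2 system norm underlying the volatility measures in the first entry of this list can be computed from the controllability Gramian: for a stable LTI system x' = Ax + Bw, y = Cx, solve AP + PA^T + BB^T = 0 and take ||G||_H2 = sqrt(trace(C P C^T)). The toy network below (a 3-node line graph with an added leak to make A Hurwitz) is an invented illustration, not a system from the paper:

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

def h2_norm(A, B, C):
    # Controllability Gramian: A P + P A^T = -B B^T  (A must be Hurwitz).
    P = solve_continuous_lyapunov(A, -B @ B.T)
    return np.sqrt(np.trace(C @ P @ C.T))

# Negative Laplacian of a 3-node line graph, with a small leak term
# so that all eigenvalues are strictly negative.
L = np.array([[ 1.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  1.0]])
A = -L - 0.1 * np.eye(3)
B = np.eye(3)  # every node driven by independent unit noise
C = np.eye(3)  # every node observed

val = h2_norm(A, B, C)
print(val)
```

Under white-noise excitation the squared H2 norm equals the total steady-state output variance, which is why it serves as a network volatility measure; the critical-link analysis in the paper then asks how this value changes as individual edge weights are perturbed.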