ORBi

Efficient modeling of random heterogeneous materials with a uniform probability density function
Bordas, Stéphane. Scientific Conference (2014, July).

Homogenised constitutive laws are widely used to predict the behaviour of composite structures. The validity of such homogenised models can be assessed using the concept of "modelling error". First, a microscopic "faithful", and potentially intractable, model of the structure is defined. Then, one tries to quantify the effect of the homogenisation procedure on a result that would be obtained by directly using the "faithful" model. Such an approach requires (a) the "faithful" model to be more representative of the physical phenomena of interest than the homogenised model, and (b) a reliable approximation of the result of the "faithful", intractable model to be available at low cost. We focus here on point (b), and more precisely on extending the techniques developed in [3][2] to estimate the error due to the homogenisation of linear, spatially random composite materials. In particular, we approximate the unknown probability density function by bounding its first moment. We present this idea in more detail, together with the numerical efficiency and computational cost of the error estimation. The fact that the probability density function is uniform is exploited to greatly reduce the computational cost. We also show some first attempts to correct the homogenised model using non-conforming, weakly intrusive microscopic patches.

Efficient modeling of random heterogeneous materials with a uniform probability density function (slides)
et al. Scientific Conference (2014).

Homogenised constitutive laws are widely used to predict the behaviour of composite structures. The validity of such homogenised models can be assessed using the concept of "modelling error". First, a microscopic "faithful", and potentially intractable, model of the structure is defined. Then, one tries to quantify the effect of the homogenisation procedure on a result that would be obtained by directly using the "faithful" model. Such an approach requires (a) the "faithful" model to be more representative of the physical phenomena of interest than the homogenised model, and (b) a reliable approximation of the result of the "faithful", intractable model to be available at low cost. We focus here on point (b), and more precisely on extending the techniques developed in [3][2] to estimate the error due to the homogenisation of linear, spatially random composite materials. In particular, we approximate the unknown probability density function by bounding its first moment. We present this idea in more detail, together with the numerical efficiency and computational cost of the error estimation. The fact that the probability density function is uniform is exploited to greatly reduce the computational cost. We also show some first attempts to correct the homogenised model using non-conforming, weakly intrusive microscopic patches.

Implementation of an XFEM toolbox in Diffpack
et al. In: International Conference on Extended Finite Element Methods (XFEM 2013), September 11–13, 2013, Lyon, France (2013).

The Diffpack Development Framework is an object-oriented software environment for the numerical solution of partial differential equations (PDEs). By design, Diffpack aims to close the gap between black-box simulation packages and technical computing environments based on interpreted languages. The framework provides a high degree of modelling flexibility while still offering the computational efficiency needed for the most demanding simulation problems in science and engineering. Technically, Diffpack is a collection of C++ libraries with classes, functions and utility programs. The numerical functionality is embedded in an environment of software engineering tools supporting the management of Diffpack development projects. Diffpack supports a variety of numerical methods, with a distinct focus on the finite element method (FEM), but has no inherent restrictions on the types of PDEs, and therefore applications, to be solved. The key point of partition-of-unity enriched methods such as XFEM and GFEM is to help capture discontinuities, singularities or large gradients in solutions, which are not well resolved by h- or p-refinement [1]. The general idea is that the mesh need not conform to moving boundaries, so that minimal or no remeshing is required during the analysis. Our main motivation is to provide a generic implementation of enrichment within a flexible C++ environment, namely the Diffpack platform. The work was inspired by some of our earlier work [6,9] and that of other colleagues [5,7,8].
We demonstrate how object-oriented programming is particularly useful for the treatment of data structures and operations associated with XFEM: mesh-geometry interaction, non-standard integration rules, application of boundary conditions, and treatment of level-set data [2,6]. We detail the implementation of these features and verify and validate them based on [5]. We show results based on unshifted and shifted enrichment [1], and study the behaviour of the stable generalized finite element method (SGFEM), which avoids blending effects and helps control the conditioning of the system matrix [4]. For the integration of elements cut by an interface, we use an in-house Delaunay triangulation algorithm proposed in [3,5] and presented in detail in a companion paper.

Stable extended finite element method: Convergence, Accuracy, Properties and Diffpack implementation
Bordas, Stéphane et al. In: International Conference on Extended Finite Element Methods (XFEM 2013), September 11–13, 2013, Lyon, France (2013).

Problems involving singularities and moving boundaries, especially when they involve discontinuities, create difficulties for the finite element method. On another, albeit related, front, two diametrically opposed approaches are attempting to simplify the CAD-to-analysis pipeline: isogeometric methods [1] aim to couple the geometry and field approximations, whilst implicit-boundary-definition-based methods attempt to decouple them [3,4,5]. We examine here one instance of the latter approach, and rely on partition-of-unity enrichment of the field variable to capture discontinuities along material interfaces or domain boundaries.
We study in particular the stable generalized finite element method of Babuška and Banerjee [6] for higher-order approximations in two and three dimensions, and propose a generic implementation within the C++ library Diffpack from inuTech GmbH [7]. The implementation of enrichment within Diffpack is presented in more detail in a companion paper. We present results obtained with our 3D implementation of partition-of-unity enrichment within Diffpack, which represents interfaces through level sets and palliates blending problems using various approaches. We study the stabilisation approach proposed in [6] in more detail, paying particular attention to the global convergence rate of the approach, and to the stability and local flux convergence close to the interfaces.
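The shifted enrichment discussed in the two XFEM entries above can be sketched in one dimension. The example below uses generic textbook constructions, not any Diffpack API; the mesh, the interface location and all function names are assumptions. It builds the abs-level-set enrichment psi(x) = |x - x_int| for a weak discontinuity and shows why the shifted variant N_i(x)(psi(x) - psi(x_i)) vanishes at every node, which preserves the standard meaning of the nodal degrees of freedom and mitigates blending artefacts:

```python
import numpy as np

# 1D sketch of partition-of-unity enrichment for a weak discontinuity
# (material interface) at x_int, using the abs level-set enrichment
# psi(x) = |x - x_int|.  Mesh, interface location and names are
# illustrative assumptions, not a specific library's API.

nodes = np.linspace(0.0, 1.0, 5)   # uniform linear FE mesh on [0, 1]
x_int = 0.37                       # assumed interface location

def hat(i, x):
    """Standard piecewise-linear shape function N_i of node i."""
    h = nodes[1] - nodes[0]
    return np.maximum(0.0, 1.0 - np.abs(x - nodes[i]) / h)

def psi(x):
    """Abs level-set enrichment: captures the kink at the interface."""
    return np.abs(x - x_int)

def phi_unshifted(i, x):
    """Unshifted enriched function N_i * psi (nonzero at node i)."""
    return hat(i, x) * psi(x)

def phi_shifted(i, x):
    # Subtracting the nodal value psi(x_i) makes the enriched function
    # vanish at all nodes, so nodal dofs keep their standard FE meaning.
    return hat(i, x) * (psi(x) - psi(nodes[i]))

for i in range(len(nodes)):
    print(i, phi_unshifted(i, nodes)[i], phi_shifted(i, nodes)[i])
```

At each node the unshifted function takes the value psi(x_i) while the shifted one is identically zero; the kink at x_int is still captured inside the cut element, since shifting only subtracts a function from the standard FE space.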