Browse ORBi
Functional Modelling Perspectives Across Disciplines: A Literature Review
Eisenbart, Boris; Blessing, Lucienne; Gericke, Kilian, in Proceedings of the 12th International Design Conference - DESIGN 2012 (2012)
The research presented in this paper discusses the different understandings of function which hamper shared functional modelling. Function models proposed in the literature of various disciplines are then analysed in order to identify the different inherent functional modelling perspectives. The paper concludes that, in order to support shared functional modelling and cross-disciplinary system development, these different functional modelling perspectives need to be linked.

Fundamental Solutions and Dual Boundary Element Method for Crack Problems in Plane Cosserat Elasticity
; Bordas, Stéphane, in Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences (2014)
In this paper, both singular and hypersingular fundamental solutions of plane Cosserat elasticity are derived and given in a ready-to-use form. The hypersingular fundamental solutions make it possible to formulate the analogue of the Somigliana stress identity, which can be used to obtain the stress and couple-stress fields inside the domain from the boundary values of the displacements, microrotation, and stress and couple-stress tractions. Using these newly derived fundamental solutions, the boundary integral equations of both types are formulated and solved by the boundary element method.
Simultaneous use of both types of equations (the approach known as the dual BEM) allows problems where parts of the boundary overlap, such as crack problems, to be treated under general geometry and loading conditions. The high accuracy of the boundary element method for both types of equations is demonstrated for a number of benchmark problems, including a Griffith crack problem and a plate with an edge crack. A detailed comparison of the BEM results and the analytical solution for a Griffith crack is given, particularly in terms of stress and couple-stress intensity factors, as well as the crack opening displacements and microrotations on the crack faces. A modified method for computing the couple-stress intensity factors is also proposed and evaluated. Finally, the asymptotic behavior of the solution to the Cosserat crack problems in the vicinity of the crack tip is analyzed.

Fundamental solutions and dual boundary element methods for fracture in plane Cosserat elasticity
; Bordas, Stéphane, in Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences (2015), 471(2179)
In this paper, both singular and hypersingular fundamental solutions of plane Cosserat elasticity are derived and given in a ready-to-use form. The hypersingular fundamental solutions make it possible to formulate the analogue of the Somigliana stress identity, which can be used to obtain the stress and couple-stress fields inside the domain from the boundary values of the displacements, microrotation, and stress and couple-stress tractions. Using these newly derived fundamental solutions, the boundary integral equations of both types are formulated and solved by the boundary element method.
Simultaneous use of both types of equations (the approach known as the dual boundary element method, BEM) allows problems where parts of the boundary overlap, such as crack problems, to be treated under general geometry and loading conditions. The high accuracy of the boundary element method for both types of equations is demonstrated for a number of benchmark problems, including a Griffith crack problem and a plate with an edge crack. A detailed comparison of the BEM results and the analytical solutions for a Griffith crack and an edge crack is given, particularly in terms of stress and couple-stress intensity factors, as well as the crack opening displacements and microrotations on the crack faces and the angular distributions of stresses and couple-stresses around the crack tip.

Fusing the Seth-Hill strain tensors to fit compressible elastic material responses in the nonlinear regime
Beex, Lars, in International Journal of Mechanical Sciences (2019), 163
Strain energy densities based on the Seth-Hill strain tensors are often used to describe the hyperelastic mechanical behaviours of isotropic, transversely isotropic and orthotropic materials at relatively large deformations. Since one parameter distinguishes which strain tensor of the Seth-Hill family is used, one has in theory the possibility to fit the material response in the nonlinear regime. For compressible deformations, however, this parameter is most often selected such that the Hencky strain tensor is recovered, because it yields rather physical stress-strain responses. Hence, the response in the nonlinear regime is in practice not often tailored to match experimental data.
To ensure that elastic responses in the nonlinear regime can be controlled more accurately, this contribution proposes three generalisations that combine several Seth-Hill strain tensors. The generalisations are formulated such that the stress-strain responses for infinitesimal deformations remain unchanged. Consequently, the identification of the Young's moduli, Poisson's ratios and shear moduli is not affected. 3D finite element simulations are performed for isotropy and orthotropy, with an emphasis on the identification of the new material parameters.

Future Issues in Design Research
Blessing, Lucienne, in Lindemann, Udo (Ed.), Human Behaviour in Design: Individuals, Teams, Tools (2003)
The first part of this chapter contains the core future issues I derived from the presentations and discussion sessions at the conference on "Human Behaviour and Design: Individuals, Teams, Tools", as well as personal reflections. The second part presents my personal views on the most urgent issues in the area of human behaviour in design, and ways to proceed.

GA4GH Passport standard for digital identity and access permissions
; ; et al, in Cell Genomics (2021), 1(2)

A Game Theoretic Analysis of the Twitter Follow-Unfollow Mechanism
; ; Brust, Matthias R. et al, in International Conference on Decision and Game Theory for Security (2018)
Twitter users often crave more followers to increase their social popularity.
While a variety of factors have been shown to attract followers, very little work has been done to analyze the mechanism by which Twitter users follow or unfollow each other. In this paper, we apply game theory to model the follow-unfollow mechanism on Twitter. We first present a two-player game based on the Prisoner's Dilemma, and subsequently evaluate the payoffs when the two players adopt different strategies. To allow two players to play multiple rounds of the game, we propose a multi-stage game model. We design a Twitter bot analyzer which follows or unfollows other Twitter users by adopting the strategies from the multi-stage game, and we develop an algorithm which enables the bot analyzer to automatically collect and analyze the data. The results from our experiment show that the follow-back ratios for both Twitter bots are very low, at 0.76% and 0.86%. This means that most Twitter users do not cooperate and only want to be followed instead of following others. Our results also exhibit the effect of different strategies on the follow-back followers as well as on the non-following followers.

Gamifying the Commute
McCall, Roderick. Presentation (2014, November 14)
The seminar outlined the I-GEAR project, which examines the use of gamification to reduce traffic congestion. Topics included user interfaces for gamified applications, requirements capture methodologies and a sample gamified application.

Gene expression data analysis using spatiotemporal blind source separation
Sainlez, Matthieu; ; in Verleysen, Michel (Ed.),
ESANN'2009 proceedings, European Symposium on Artificial Neural Networks - Advances in Computational Intelligence and Learning (2009)
We propose a "time-biased" and a "space-biased" method for spatiotemporal independent component analysis (ICA). The methods rely on computing an orthogonal approximate joint diagonalizer of a collection of covariance-like matrices. In the time-biased version, the time signatures of the ICA modes are constrained to be white, whereas the space-biased version imposes the same condition on the space signatures. We apply the two methods to the analysis of gene expression data, where the genes play the role of space and the cell samples stand for time. This study is a step towards addressing a question first raised by Liebermeister: whether ICA methods for gene expression analysis should impose independence across genes or across cell samples. Our preliminary experiment indicates that both approaches have value, and that exploring the continuum between these two extremes can provide useful information about the interactions between genes and their impact on the phenotype.

Generalising from conventional pipelines using deep learning in high-throughput screening workflows
Garcia Santa Cruz, Beatriz; ; Gomez Giro, Gemma et al, in Scientific Reports (2022)
The study of complex diseases relies on large amounts of data to build models toward precision medicine.
Such data acquisition is feasible in the context of high-throughput screening, in which the quality of the results relies on the accuracy of the image analysis. Although state-of-the-art solutions for image segmentation employ deep learning approaches, the high cost of manually generating ground-truth labels for model training hampers day-to-day application in experimental laboratories. Alternatively, traditional computer-vision-based solutions do not need expensive labels for their implementation. Our work combines both approaches by training a deep learning network using weak training labels automatically generated with conventional computer vision methods. Our network surpasses the conventional segmentation quality by generalising beyond noisy labels, providing a 25% increase in mean intersection over union, and simultaneously reducing the development and inference times. Our solution was embedded into an easy-to-use graphical user interface that allows researchers to assess the predictions and correct potential inaccuracies with minimal human input. To demonstrate the feasibility of training a deep learning solution on a large dataset of noisy labels automatically generated by a conventional pipeline, we compared our solution against the common approach of training a model from a small dataset manually curated by several experts. Our work suggests that humans perform better in context interpretation, such as error assessment, while computers outperform in pixel-by-pixel fine segmentation. Such pipelines are illustrated with a case study on image segmentation for autophagy events. This work aims for better translation of new technologies to real-world settings in microscopy-image analysis.
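The 25% gain reported in the abstract above is in mean intersection over union, a standard segmentation metric. As a minimal, illustrative sketch (not the paper's actual evaluation code), mIoU over flattened label masks can be computed as:

```python
def mean_iou(pred, target, num_classes):
    """Mean intersection-over-union between two flattened label masks."""
    ious = []
    for c in range(num_classes):
        inter = sum(1 for p, t in zip(pred, target) if p == c and t == c)
        union = sum(1 for p, t in zip(pred, target) if p == c or t == c)
        if union:                      # skip classes absent from both masks
            ious.append(inter / union)
    return sum(ious) / len(ious)

# Toy 2-class masks: class 0 has IoU 2/3, class 1 has IoU 3/4.
pred = [0, 0, 1, 0, 1, 1]
target = [0, 0, 1, 1, 1, 1]
print(mean_iou(pred, target, 2))  # (2/3 + 3/4) / 2 ≈ 0.708
```

In practice the masks are 2D arrays that get flattened first; the per-class averaging is what lets a rare class weigh as much as the background.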
Generalization of the primary tone phase variation method: An exclusive way of isolating the frequency-following response components
Lucchetti, Federico; ; et al, in The Journal of the Acoustical Society of America (2018)

Generalized Information Theory based on the Theory of Hints
Pouly, Marc, in Liu, Weiru (Ed.), Symbolic and Quantitative Approaches to Reasoning with Uncertainty (2011)
The aggregate uncertainty is the only known functional for Dempster-Shafer theory that generalizes the Shannon and Hartley measures and satisfies all classical requirements for uncertainty measures, including subadditivity. Although posed several times in the literature, it is still an open problem whether the aggregate uncertainty is unique under these properties. This paper derives an uncertainty measure based on the theory of hints and shows its equivalence to the pignistic entropy. The latter does not satisfy subadditivity, but the viewpoint of hints uncovers a weaker version of subadditivity. On the other hand, the pignistic entropy has some crucial advantages over the aggregate uncertainty, i.e. explicitness of the formula and sensitivity to changes in evidence. We observe that neither of the two measures captures the full uncertainty of hints and propose an extension of the pignistic entropy, called the hints entropy, which satisfies all axiomatic requirements, including subadditivity, while preserving the above advantages over the aggregate uncertainty.
Generalizing the isogeometric concept: weakening the tight coupling between geometry and simulation in IGA
Tomar, Satyendra; ; et al. Presentation (2016, June 02)
In the standard paradigm of isogeometric analysis [2, 1], the geometry and the simulation spaces are tightly integrated, i.e. the non-uniform rational B-splines (NURBS) space, which is used for the geometry representation of the domain, is also employed for the numerical solution of the problem over the domain. However, this tight coupling is disadvantageous in certain situations: for example, when the geometry of the domain can be represented by low-order NURBS but the numerical solution can be obtained with improved accuracy by using NURBS of order higher than that required for the geometry, or in shape and topology optimization, where the constraint of using the same space for the geometry and the numerical solution is not favorable. Therefore, we study the effect of decoupling the spaces for the geometry representation and the numerical solution, though still using the functions prevalent in CAD/CAGD. To begin with, we perform the patch tests on various combinations of polynomial degree, geometry type, and various cases of varying degrees and control variables between the geometry and the numerical solution. This shows that certain cases, perhaps intuitive ones, should be avoided in practice because the patch test fails. The above-mentioned situations are further explored with some numerical examples, which show that weakening the tight coupling between geometry and simulation offers more flexibility in choosing the numerical solution spaces.
[1] J. Cottrell, T.J.R. Hughes, and Y. Bazilevs.
Isogeometric Analysis: Toward Integration of CAD and FEA, volume 80. Wiley, Chichester, 2009.
[2] T.J.R. Hughes, J. Cottrell, and Y. Bazilevs. Isogeometric analysis: CAD, finite elements, NURBS, exact geometry and mesh refinement. Computer Methods in Applied Mechanics and Engineering, 194:4135–4195, 2005.

Generalizing the isogeometric concept: weakening the tight coupling between geometry and simulation in IGA
Bordas, Stéphane; Tomar, Satyendra; et al. Scientific Conference (2016, May 30)
In the standard paradigm of isogeometric analysis [2, 1], the geometry and the simulation spaces are tightly integrated, i.e. the non-uniform rational B-splines (NURBS) space, which is used for the geometry representation of the domain, is also employed for the numerical solution of the problem over the domain. However, this tight coupling is disadvantageous in certain situations: for example, when the geometry of the domain can be represented by low-order NURBS but the numerical solution can be obtained with improved accuracy by using NURBS of order higher than that required for the geometry, or in shape and topology optimization, where the constraint of using the same space for the geometry and the numerical solution is not favorable. Therefore, we study the effect of decoupling the spaces for the geometry representation and the numerical solution, though still using the functions prevalent in CAD/CAGD. To begin with, we perform the patch tests on various combinations of polynomial degree, geometry type, and various cases of varying degrees and control variables between the geometry and the numerical solution. This shows that certain cases, perhaps intuitive ones, should be avoided in practice because the patch test fails.
The above-mentioned situations are further explored with some numerical examples, which show that weakening the tight coupling between geometry and simulation offers more flexibility in choosing the numerical solution spaces.

Generating Macroscopic, Purpose-Dependent Production Factors Through Monte Carlo Sampling Techniques
Scheffer, Ariane Hélène Marie; Cantelmo, Guido; Viti, Francesco, in Transportation Research Procedia (2017)

Generating purpose-dependent production factors through Monte Carlo sampling techniques
Scheffer, Ariane Hélène Marie; Cantelmo, Guido; Viti, Francesco. Scientific Conference (2017, May)

Generic Inference: A Unifying Theory for Automated Reasoning
Pouly, Marc ; Book published by John Wiley & Sons (2011)
This book provides a rigorous algebraic study of the most popular inference formalisms, with a special focus on their wide application area, showing that all these tasks can be performed by a single generic inference algorithm. Written by the leading international authority on the topic, it includes an algebraic perspective (a study of the valuation algebra framework), an algorithmic perspective (a study of the generic inference schemes) and a "practical" perspective (formalisms and applications). Researchers in a number of fields, including artificial intelligence, operational research, databases and other areas of computer science, as well as graduate students and professional programmers of inference methods, will benefit from this work.
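The "single generic inference algorithm" described in the book entry above rests on the idea that combination and marginalization can be parameterized by a semiring. A toy sketch of that idea (the function names and two-state factors here are our own illustration, not the book's code): the same combine/marginalize routine answers a probabilistic query or an optimization query depending on which semiring operations are supplied.

```python
def combine(f, g, mul):
    """Pointwise semiring product of two factors over the same variable,
    each given as {value: semiring weight}."""
    return {x: mul(f[x], g[x]) for x in f}

def marginalize(f, add):
    """Eliminate the variable by folding its weights with the semiring addition."""
    acc = None
    for v in f.values():
        acc = v if acc is None else add(acc, v)
    return acc

f = {0: 0.3, 1: 0.7}
g = {0: 0.5, 1: 0.5}
prod = combine(f, g, lambda a, b: a * b)
z = marginalize(prod, lambda a, b: a + b)   # sum-product: total probability mass
best = marginalize(prod, max)               # max-product: weight of the best assignment
print(z, best)
```

Swapping in (min, +) instead would turn the very same routine into a shortest-path style computation, which is the unification the valuation algebra framework formalizes.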
Generic Local Computation
Pouly, Marc ; ; Report (2011)
Many problems of artificial intelligence, or more generally of information processing, have a generic solution based on local computation on join trees or acyclic hypertrees. There are several variants of this method, all based on the algebraic structure of a valuation algebra. A strong requirement underlying this approach is that the elements of a problem decomposition form a join tree. Although it is always possible to construct covering join trees when the requirement is not originally satisfied, it is not always possible, or not efficient, to extend the elements of the decomposition to the covering join tree. Therefore, this paper introduces different variants of an axiomatic framework of valuation algebras which prove sufficient for local computation without the need to extend the factors of a decomposition. This framework covers the axiomatic system proposed by Shenoy and Shafer (1990). Particular emphasis is laid on the important special cases of idempotent algebras and algebras with some notion of division. It is shown that all well-known architectures for local computation, such as the Shenoy-Shafer, Lauritzen-Spiegelhalter and HUGIN architectures, may be adapted to this new framework. Further, a new architecture for idempotent algebras is presented. As examples, in addition to the classical instances of valuation algebras, semiring-induced valuation algebras, Gaussian potentials and the relational algebra are presented.

Generic Solution Construction in Valuation-Based Systems
Pouly, Marc, in Butz, Cory; Lingras, Pawan (Eds.),
Advances in Artificial Intelligence (2011)
Valuation algebras abstract a large number of formalisms for automated reasoning and enable the definition of generic inference procedures. Many of these formalisms provide some notion of solutions. Typical examples are satisfying assignments in constraint systems, models in logics, or solutions to linear equation systems. Contrary to inference, there is no general algorithm to compute solutions in arbitrary valuation algebras. This paper states formal requirements for the presence of solutions and proposes a generic algorithm for solution construction based on the results of a previously executed inference scheme. We study the application of generic solution construction to semiring constraint systems, sparse linear systems and algebraic path problems, and show that the proposed method generalizes various existing approaches for specific formalisms in the literature.

Genetic Algorithm based roadmapping: A method for product innovation
Suzianti, Amalia. Doctoral thesis (2011)
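The algebraic path problems mentioned in the solution-construction abstract above have a classic instance: plugging the tropical (min, +) semiring into a path-closure computation yields the familiar all-pairs shortest-path recursion. A minimal sketch under that reading (the weight matrix and function name are illustrative assumptions, not from the paper):

```python
INF = float("inf")

def tropical_closure(w):
    """All-pairs shortest path lengths: Floyd-Warshall as the closure of a
    weight matrix over the tropical (min, +) semiring."""
    n = len(w)
    d = [row[:] for row in w]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                # tropical "sum" (min) over the tropical "product" (+) of path weights
                d[i][j] = min(d[i][j], d[i][k] + d[k][j])
    return d

w = [[0, 3, INF],
     [INF, 0, 1],
     [4, INF, 0]]
print(tropical_closure(w)[0][2])  # 4: shortest path 0 -> 1 -> 2
```

Replacing (min, +) with other semirings gives, with the same closure structure, reachability, widest paths, or regular-language path expressions.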