Browse ORBi


GA4GH Passport standard for digital identity and access permissions
et al. In Cell Genomics (2021), 1(2).

A Game Theoretic Analysis of the Twitter Follow-Unfollow Mechanism
Brust, Matthias R. et al. In International Conference on Decision and Game Theory for Security (2018).
Twitter users often crave more followers to increase their social popularity. While a variety of factors have been shown to attract followers, very little work has been done to analyze the mechanism by which Twitter users follow or unfollow each other. In this paper, we apply game theory to modeling the follow-unfollow mechanism on Twitter. We first present a two-player game based on the Prisoner's Dilemma, and evaluate the payoffs when the two players adopt different strategies. To allow the two players to play multiple rounds of the game, we propose a multi-stage game model. We design a Twitter bot analyzer that follows or unfollows other Twitter users according to the strategies of the multi-stage game, and we develop an algorithm that enables the bot analyzer to collect and analyze the data automatically. The results from our experiment show that the follow-back ratios for both Twitter bots are very low, at 0.76% and 0.86%. This means that most Twitter users do not cooperate and only want to be followed rather than to follow others. Our results also show the effect of the different strategies on follow-back followers and on non-following followers.
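The two-player game described in the abstract above can be sketched as a Prisoner's-Dilemma-style payoff table plus the follow-back-ratio arithmetic. This is a minimal illustration, assuming hypothetical payoff values (the paper's actual payoffs are not given here); only the 0.76%/0.86% ratios come from the abstract.

```python
# Illustrative two-player follow/unfollow game in Prisoner's Dilemma form.
# Payoff values below are hypothetical stand-ins, not the paper's numbers.
# Strategies: "follow" (cooperate) or "not_follow" (defect).

PAYOFFS = {
    ("follow", "follow"): (3, 3),        # mutual follow: both gain a follower
    ("follow", "not_follow"): (0, 5),    # B gains a follower without reciprocating
    ("not_follow", "follow"): (5, 0),
    ("not_follow", "not_follow"): (1, 1),
}

def play(strategy_a, strategy_b):
    """Return the (player A, player B) payoffs for one round."""
    return PAYOFFS[(strategy_a, strategy_b)]

def follow_back_ratio(follows_sent, follow_backs):
    """Fraction of follow actions that were reciprocated."""
    return follow_backs / follows_sent

print(play("follow", "not_follow"))                    # (0, 5)
# e.g. 76 follow-backs out of 10,000 follows gives the reported 0.76%:
print(round(follow_back_ratio(10_000, 76) * 100, 2))   # 0.76
```

The payoff ordering (5 > 3 > 1 > 0) preserves the defining Prisoner's Dilemma property: defecting against a cooperator pays best, so "only wanting to be followed" is the individually rational strategy.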
Gamifying the Commute
McCall, Roderick. Presentation (2014, November 14).
The seminar outlined the I-GEAR project, which examines the use of gamification to reduce traffic congestion. Topics included user interfaces for gamified applications, requirements-capture methodologies and a sample gamified application.

Gene expression data analysis using spatiotemporal blind source separation
Sainlez, Matthieu et al. In Verleysen, Michel (Ed.), ESANN 2009 proceedings, European Symposium on Artificial Neural Networks – Advances in Computational Intelligence and Learning (2009).
We propose a "time-biased" and a "space-biased" method for spatiotemporal independent component analysis (ICA). The methods rely on computing an orthogonal approximate joint diagonalizer of a collection of covariance-like matrices. In the time-biased version, the time signatures of the ICA modes are constrained to be white, whereas the space-biased version imposes the same condition on the space signatures. We apply the two methods to the analysis of gene expression data, where the genes play the role of space and the cell samples stand for time. This study is a step towards addressing a question first raised by Liebermeister: whether ICA methods for gene expression analysis should impose independence across genes or across cell samples. Our preliminary experiment indicates that both approaches have value, and that exploring the continuum between these two extremes can provide useful information about the interactions between genes and their impact on the phenotype.

Gene Prioritization in the Epilepsies
Chatterjee, Sreyoshi. Doctoral thesis (2022).

Generalising from conventional pipelines using deep learning in high-throughput screening workflows
Garcia Santa Cruz, Beatriz; Gomez Giro, Gemma et al. In Scientific Reports (2022).
The study of complex diseases relies on large amounts of data to build models toward precision medicine. Such data acquisition is feasible in the context of high-throughput screening, in which the quality of the results relies on the accuracy of the image analysis. Although state-of-the-art solutions for image segmentation employ deep learning approaches, the high cost of manually generating ground-truth labels for model training hampers day-to-day application in experimental laboratories. Alternatively, traditional computer-vision-based solutions do not need expensive labels for their implementation. Our work combines both approaches by training a deep learning network using weak training labels automatically generated with conventional computer vision methods. Our network surpasses the conventional segmentation quality by generalising beyond noisy labels, providing a 25% increase in mean intersection over union, and simultaneously reducing the development and inference times. Our solution was embedded into an easy-to-use graphical user interface that allows researchers to assess the predictions and correct potential inaccuracies with minimal human input.
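The mean intersection-over-union figure quoted in the abstract above can be computed as follows. This is a minimal stdlib-only sketch, not the authors' pipeline: masks are represented as sets of (row, col) pixel coordinates, and the example masks are hypothetical.

```python
# Minimal sketch of the mean intersection-over-union (IoU) segmentation metric.
# Binary masks are modeled as sets of foreground pixel coordinates.

def iou(pred, target):
    """IoU of two binary masks given as sets of pixel coordinates."""
    if not pred and not target:
        return 1.0  # convention: two empty masks agree perfectly
    return len(pred & target) / len(pred | target)

def mean_iou(pairs):
    """Average IoU over (prediction, ground-truth) mask pairs."""
    return sum(iou(p, t) for p, t in pairs) / len(pairs)

pred = {(0, 0), (0, 1)}   # hypothetical predicted foreground pixels
target = {(0, 0)}         # hypothetical ground-truth foreground pixels
print(iou(pred, target))                             # 0.5: intersection 1, union 2
print(mean_iou([(pred, target), (target, target)]))  # 0.75
```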
To demonstrate the feasibility of training a deep learning solution on a large dataset of noisy labels automatically generated by a conventional pipeline, we compared our solution against the common approach of training a model on a small dataset manually curated by several experts. Our work suggests that humans perform better at context interpretation, such as error assessment, while computers outperform them at pixel-by-pixel fine segmentation. Such pipelines are illustrated with a case study on image segmentation for autophagy events. This work aims for better translation of new technologies to real-world settings in microscopy-image analysis.

Generalization of the primary tone phase variation method: An exclusive way of isolating the frequency-following response components
Lucchetti, Federico et al. In The Journal of the Acoustical Society of America (2018).

Generalized Information Theory based on the Theory of Hints
Pouly, Marc. In Liu, Weiru (Ed.), Symbolic and Quantitative Approaches to Reasoning with Uncertainty (2011).
The aggregate uncertainty is the only known functional for Dempster-Shafer theory that generalizes the Shannon and Hartley measures and satisfies all classical requirements for uncertainty measures, including subadditivity. Although posed several times in the literature, it is still an open problem whether the aggregate uncertainty is unique under these properties. This paper derives an uncertainty measure based on the theory of hints and shows its equivalence to the pignistic entropy. It does not satisfy subadditivity, but the viewpoint of hints uncovers a weaker version of subadditivity. On the other hand, the pignistic entropy has some crucial advantages over the aggregate uncertainty, namely explicitness of the formula and sensitivity to changes in evidence. We observe that neither of the two measures captures the full uncertainty of hints and propose an extension of the pignistic entropy, called hints entropy, that satisfies all axiomatic requirements, including subadditivity, while preserving the above advantages over the aggregate uncertainty.

Generalizing the isogeometric concept: weakening the tight coupling between geometry and simulation in IGA
Tomar, Satyendra et al. Presentation (2016, June 02).
In the standard paradigm of isogeometric analysis [2, 1], the geometry and the simulation spaces are tightly integrated: the non-uniform rational B-splines (NURBS) space used for the geometry representation of the domain is also employed for the numerical solution of the problem over the domain. However, this tight coupling is disadvantageous in certain situations, for example when the geometry of the domain can be represented by low-order NURBS but the numerical solution can be obtained with improved accuracy using NURBS of higher order than required for the geometry, or in shape and topology optimization, where the constraint of using the same space for the geometry and the numerical solution is unfavorable. Therefore, we study the effect of decoupling the spaces for the geometry representation and the numerical solution, while still using the functions prevalent in CAD/CAGD. To begin with, we perform patch tests on various combinations of polynomial degree, geometry type, and varying degrees and control variables between the geometry and the numerical solution. This shows that certain cases, though perhaps intuitive, should be avoided in practice because the patch test fails. The above situations are further explored with numerical examples, which show that weakening the tight coupling between geometry and simulation offers more flexibility in choosing the numerical solution spaces.
[1] J. Cottrell, T.J.R. Hughes, and Y. Bazilevs. Isogeometric Analysis: Toward Integration of CAD and FEA, volume 80. Wiley, Chichester, 2009.
[2] T.J.R. Hughes, J. Cottrell, and Y. Bazilevs. Isogeometric analysis: CAD, finite elements, NURBS, exact geometry and mesh refinement. Computer Methods in Applied Mechanics and Engineering, 194:4135–4195, 2005.

Generalizing the isogeometric concept: weakening the tight coupling between geometry and simulation in IGA
Bordas, Stéphane; Tomar, Satyendra et al. Scientific Conference (2016, May 30).

Generating Macroscopic, Purpose-Dependent Production Factors Through Monte Carlo Sampling Techniques
Scheffer, Ariane Hélène Marie; Cantelmo, Guido; Viti, Francesco. In Transportation Research Procedia (2017).

Generating purpose-dependent production factors through Monte Carlo sampling techniques
Scheffer, Ariane Hélène Marie; Cantelmo, Guido; Viti, Francesco. Scientific Conference (2017, May).

Generic Inference: A Unifying Theory for Automated Reasoning
Pouly, Marc. Book published by John Wiley & Sons (2011).
This book provides a rigorous algebraic study of the most popular inference formalisms, with a special focus on their wide application area, showing that all these tasks can be performed by a single generic inference algorithm. Written by the leading international authority on the topic, it includes an algebraic perspective (study of the valuation algebra framework), an algorithmic perspective (study of the generic inference schemes) and a "practical" perspective (formalisms and applications). Researchers in a number of fields, including artificial intelligence, operational research, databases and other areas of computer science, as well as graduate students and professional programmers of inference methods, will benefit from this work.

Generic Local Computation
Pouly, Marc et al. Report (2011).
Many problems of artificial intelligence, or more generally of information processing, have a generic solution based on local computation on join trees or acyclic hypertrees. There are several variants of this method, all based on the algebraic structure of a valuation algebra. A strong requirement underlying this approach is that the elements of a problem decomposition form a join tree. Although it is always possible to construct covering join trees when the requirement is not originally satisfied, it is not always possible, or not efficient, to extend the elements of the decomposition to the covering join tree. Therefore, this paper introduces different variants of an axiomatic framework of valuation algebras which prove sufficient for local computation without the need to extend the factors of a decomposition. This framework covers the axiomatic system proposed by Shenoy and Shafer (1990). Particular emphasis is laid on the important special cases of idempotent algebras and algebras with some notion of division. It is shown that all well-known architectures for local computation, such as the Shenoy-Shafer, Lauritzen-Spiegelhalter and HUGIN architectures, may be adapted to this new framework. Further, a new architecture for idempotent algebras is presented. As examples, in addition to the classical instances of valuation algebras, semiring-induced valuation algebras, Gaussian potentials and the relational algebra are presented.

Generic Solution Construction in Valuation-Based Systems
Pouly, Marc. In Butz, Cory; Lingras, Pawan (Eds.), Advances in Artificial Intelligence (2011).
Valuation algebras abstract a large number of formalisms for automated reasoning and enable the definition of generic inference procedures. Many of these formalisms provide some notion of solution. Typical examples are satisfying assignments in constraint systems, models in logics, or solutions to linear equation systems. Contrary to inference, there is no general algorithm to compute solutions in arbitrary valuation algebras. This paper states formal requirements for the presence of solutions and proposes a generic algorithm for solution construction based on the results of a previously executed inference scheme. We study the application of generic solution construction to semiring constraint systems, sparse linear systems and algebraic path problems, and show that the proposed method generalizes various existing approaches for specific formalisms in the literature.
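The inference-then-construction pattern described in the last abstract above can be sketched concretely for one semiring instance. This is a minimal illustration, not the paper's formal algorithm: max-product variable elimination over table-valued factors (the inference step), followed by a backward pass over recorded argmax tables (the solution-construction step). The factors and variable names are hypothetical.

```python
# Sketch: generic inference (max-product variable elimination) followed by
# solution construction, in the spirit of valuation-based systems.
# Factors are (scope, table) pairs mapping value tuples to semiring values.
from itertools import product

def eliminate(factors, order, domains):
    """Eliminate variables in `order` (assumed to cover all variables),
    recording an argmax table per variable for later backtracking."""
    argmax_tables = []
    for var in order:
        related = [f for f in factors if var in f[0]]
        factors = [f for f in factors if var not in f[0]]
        scope = sorted({v for s, _ in related for v in s if v != var})
        table, best_choice = {}, {}
        for assignment in product(*(domains[v] for v in scope)):
            ctx = dict(zip(scope, assignment))
            best_val, best_x = None, None
            for x in domains[var]:
                ctx[var] = x
                val = 1.0
                for s, t in related:          # combine = product
                    val *= t[tuple(ctx[v] for v in s)]
                if best_val is None or val > best_val:
                    best_val, best_x = val, x  # marginalize = max
            table[assignment] = best_val
            best_choice[assignment] = best_x
        factors.append((tuple(scope), table))
        argmax_tables.append((var, tuple(scope), best_choice))
    value = 1.0
    for s, t in factors:   # every remaining scope is empty at this point
        value *= t[()]
    return value, argmax_tables

def construct_solution(argmax_tables):
    """Walk the recorded argmax tables in reverse to build one maximizer."""
    solution = {}
    for var, scope, best_choice in reversed(argmax_tables):
        solution[var] = best_choice[tuple(solution[v] for v in scope)]
    return solution

# Hypothetical two-variable example with factors f(a) and g(a, b).
domains = {"a": [0, 1], "b": [0, 1]}
f = (("a",), {(0,): 0.4, (1,): 0.6})
g = (("a", "b"), {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.8})
value, tables = eliminate([f, g], ["a", "b"], domains)
print(round(value, 2), construct_solution(tables))  # 0.48 {'b': 1, 'a': 1}
```

Swapping the product/max pair for other semiring operations (sum/product, min/plus) yields the other classical instances the abstract mentions, which is exactly the genericity the valuation-algebra framework abstracts.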
Genetic Algorithm based roadmapping: A method for product innovation
Suzianti, Amalia. Doctoral thesis (2011).

The Genetics Lab: An Innovative Tool for Assessment of Intelligence by Means of Complex Problem Solving
et al. Scientific Conference (2011).

A Geographical Analysis of Bicycle Sharing Systems
Médard de Chardon, Cyrille. Doctoral thesis (2016).
This thesis evaluates the performance of bicycle sharing systems (BSS), autonomous systems of accessible bicycles that can easily be used for one-way trips, and determines through quantitative and qualitative methods whether they achieve the social and environmental outcomes they promote. Such systems are typically surrounded by positive narratives of success and of health, environmental and social benefits; this work challenges those notions. The thesis begins with the formalisation of BSS station-level and trip data, revealing alternative data contained within. Combined with spatiotemporal data analysis, this allows the estimation of trips, a potential measure of success. Because most operators do not provide consistent or comparable usage metrics, this work opens the performance of a heavily promoted technological transport innovation to public scrutiny. Performance estimates for 75 case studies show that a majority achieve fewer than two trips per day per bicycle, suggesting a poor investment, regardless of existing social-justice issues and exaggerated environmental benefits. Using this metric, the work determines which attributes affect performance. While station density and cycling infrastructure, among others, are found to matter, the results challenge promoted practice. The formalisation also reveals rebalancing, the moving of bicycles to compensate for demand exceeding supply. Spatiotemporal data analysis and interviews with operators provide the first description of rebalancing as actually applied, an alternative perspective to the many theoretical optimisation models. Results show that rebalancing is spatially selective and influences BSS outcomes, potentially contrary to its purpose. Finally, through a critical urban-sustainability perspective, this thesis presents darker aspects of BSS beyond the golden narratives, showing conflicts of interest, controversy and the commercialisation of an initially environmental and anti-consumerist concept. Existential questions arise because BSS, mostly privately operated, provide benefits to an already advantaged class while public space is privatised and urban advertising increases. The work concludes by suggesting that alternative investments, such as cycling infrastructure, may be more beneficial and just.

Geographical Modelling with Cellular-Automata
Caruso, Geoffrey. Presentation (2009).

Geometrical and material uncertainties for the mechanics of composites
Bordas, Stéphane et al. Scientific Conference (2019).
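The performance metric used in the bicycle-sharing thesis above, trips per day per bicycle, is simple arithmetic once trips have been estimated from station data. A minimal sketch, with hypothetical figures (only the two-trips-per-day threshold comes from the abstract):

```python
# Sketch of the BSS performance metric discussed above: average trips per
# day per bicycle. The trip estimation itself is the hard part and is not
# shown; the fleet size and trip counts below are hypothetical.

def trips_per_day_per_bicycle(total_trips, days, fleet_size):
    """Average daily usage of each bicycle in the fleet."""
    return total_trips / (days * fleet_size)

# Hypothetical system: 54,000 estimated trips over 30 days, 1,000 bicycles.
rate = trips_per_day_per_bicycle(54_000, 30, 1_000)
print(rate)  # 1.8, i.e. below the two-trips-per-day mark most systems miss
```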