ORBi

Predicting near-optimal skin distance in Verlet buffer approach for Discrete Element Method
Mainassara Chekaraou, Abdoul Wahid; Besseron, Xavier; Rousset, Alban et al. In 10th IEEE Workshop on Parallel/Distributed Combinatorics and Optimization (June 2020).
The Verlet list method is a well-known bookkeeping technique for the interaction list, used in both Molecular Dynamics (MD) and the Discrete Element Method (DEM). The Verlet buffer technique enhances the Verlet list by extending the interaction radius of each particle by an extra margin, so that more particles enter the interaction list. The extra margin is based on the local flow regime of each particle, to account for the different flow regimes that can coexist in the domain. However, unlike in MD, the choice of the near-optimal extra margin (the one that ensures the best performance) for each particle, and of the related parameters, remains unexplored in DEM. In this study, we demonstrate that the near-optimal extra margin can be characterized fairly well by four parameters describing each particle's local flow regime: the particle velocity, the ratio of containing-cell size to particle size, the containing-cell solid fraction, and the total number of particles in the system. We model the near-optimal extra margin as a quadratic polynomial function of these parameters. We use the DAKOTA software to carry out the Design and Analysis of Computer Experiments (DACE) and to sample the parameters for the simulations. For a given instance of the parameter set, a global optimization method finds the near-optimal extra margin, which is required for the construction of the quadratic polynomial model.
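The quadratic-polynomial margin model described above can be sketched in a few lines. The sketch below is illustrative only: synthetic samples stand in for the DACE-designed experiments, and all helper names are hypothetical.

```python
import numpy as np

def quadratic_features(X):
    """Expand each row [x1..x4] into all monomials of degree <= 2."""
    n, d = X.shape
    cols = [np.ones(n)]                                   # constant term
    cols += [X[:, i] for i in range(d)]                   # linear terms
    cols += [X[:, i] * X[:, j]                            # quadratic terms
             for i in range(d) for j in range(i, d)]
    return np.column_stack(cols)

def fit_margin_model(X, margins):
    """Least-squares coefficients of the quadratic surrogate."""
    coeffs, *_ = np.linalg.lstsq(quadratic_features(X), margins, rcond=None)
    return coeffs

def predict_margin(coeffs, X):
    return quadratic_features(X) @ coeffs

# Toy usage: 50 synthetic samples of the four flow-regime parameters
# (velocity, cell/particle size ratio, solid fraction, particle count,
# all rescaled to [0.1, 1]) with a made-up ground-truth margin.
rng = np.random.default_rng(0)
X = rng.uniform(0.1, 1.0, size=(50, 4))
true_margin = 0.2 + 0.5 * X[:, 0] + 0.3 * X[:, 1] * X[:, 2]
coeffs = fit_margin_model(X, true_margin)
print("max abs error:", np.max(np.abs(predict_margin(coeffs, X) - true_margin)))
```

Since the made-up ground truth lies in the span of the quadratic basis, the fit is exact up to rounding; on real simulation data the residual measures how well the four parameters explain the near-optimal margin.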
The numerous simulations generated by the parameter sampling were performed in a High-Performance Computing (HPC) environment allowing parallel and concurrent executions. This work provides a better understanding of the Verlet buffer method in DEM simulations by analyzing its performance and behavior in various configurations. The near-optimal extra margin can reasonably be predicted from two of the four chosen parameters using the quadratic polynomial model. This model has been integrated into XDEM so that the extra margin is chosen automatically, without any input from the user. Evaluations on real industrial-level test cases show up to a 26% reduction in execution time.

Evolving a Deep Neural Network Training Time Estimator
Pinel, Frédéric; Hundt, Christian et al. In Communications in Computer and Information Science (February 2020).
We present a procedure for the design of a Deep Neural Network (DNN) that estimates the execution time per batch for training a deep neural network on GPU accelerators. The estimator is intended to be embedded in the scheduler of a shared GPU infrastructure, providing estimated training times for a wide range of network architectures when a user submits a training job. To this end, a very short and simple representation of a given DNN is chosen. To compensate for the limited descriptive power of this basic representation, a novel co-evolutionary approach is taken to fit the estimator: the training set for the estimator, i.e. DNNs, is evolved by an evolutionary algorithm that optimizes the accuracy of the estimator. In the process, the genetic algorithm evolves DNNs, generates Python/Keras programs, and projects them onto the simple representation. The genetic operators are dynamic: they change with the estimator's accuracy in order to balance accuracy with generalization. Results show that, despite the low degree of information in the representation and the simple initial design of the predictor, co-evolving the training set performs better than a near-randomly generated population of DNNs.

Bayesian optimisation to select Rössler system parameters used in Chaotic Ant Colony Optimisation for Coverage
Kieffer, Emmanuel; Brust, Matthias R. et al. In Journal of Computational Science (2020), 41.

Tackling Large-Scale and Combinatorial Bi-Level Problems With a Genetic Programming Hyper-Heuristic
Kieffer, Emmanuel; Danoy, Grégoire; Brust, Matthias R. et al. In IEEE Transactions on Evolutionary Computation (2020), 24(1), 44-56.

Automatic Software Tuning of Parallel Programs for Energy-Aware Executions
Varrette, Sébastien; Pinel, Frédéric; Kieffer, Emmanuel et al. In Proc. of the 13th Intl. Conf. on Parallel Processing and Applied Mathematics (PPAM 2019) (December 2019).
For large-scale systems such as data centers, energy efficiency has proven key to reducing capital and operational expenses as well as environmental impact. The power drawn by a system is closely related to the type and characteristics of the workload the device is running. For this reason, this paper presents an automatic software tuning method for parallel program generation, able to adapt to and exploit the hardware features available on a target computing system (such as an HPC facility or a cloud system) better than traditional compiler infrastructures. We propose a search-based approach combining exact methods and approximate heuristics that evolves programs in order to find optimized configurations over an ever-increasing number of tunable knobs, i.e., code transformations and execution options (such as the number of OpenMP threads and/or the CPU frequency settings). The main objective is to outperform the configurations generated by traditional compiler infrastructures for selected KPIs, i.e., performance, energy and power usage (for both the CPU and DRAM), as well as runtime. First experimental results, tied to the local optimization phase of the proposed framework, are encouraging, demonstrating between 8% and 41% improvement for all considered metrics on a reference benchmarking application (Linpack). This opens novel perspectives for the global optimization step currently under investigation within the presented framework, with the ambition to pave the way toward automatic tuning of energy-aware applications beyond the performance of current state-of-the-art compiler infrastructures.

A GP Hyper-Heuristic Approach for Generating TSP Heuristics
Duflo, Gabriel; Kieffer, Emmanuel; Brust, Matthias R. et al. In 33rd IEEE International Parallel and Distributed Processing Symposium (IPDPS 2019) (May 2019).

Tackling Large-Scale and Combinatorial Bi-level Problems with a Genetic Programming Hyper-heuristic
Kieffer, Emmanuel; Danoy, Grégoire; Bouvry, Pascal et al. In IEEE Transactions on Evolutionary Computation (2019).
Combinatorial bi-level optimization remains a challenging topic, especially when the lower level is an NP-hard problem. In this work, we tackle large-scale and combinatorial bi-level problems using GP hyper-heuristics, i.e., an approach that makes it possible to train heuristics like a machine learning model. Our contribution targets the intensive and complex lower-level optimizations that occur when solving a large-scale, combinatorial bi-level problem. For this purpose, we consider hyper-heuristics through heuristic generation. Using a GP hyper-heuristic approach, we train greedy heuristics to make them more reliable when encountering unseen lower-level instances that may be generated during bi-level optimization. To validate our approach, referred to as GA+AGH, we tackle instances of the Bi-level Cloud Pricing Optimization Problem (BCPOP), which models the trading interactions between a cloud service provider and cloud service customers. Numerical results demonstrate the ability of the trained heuristics to cope with the inherent nested structure that makes bi-level optimization problems so hard. Furthermore, training heuristics for lower-level optimization outperforms human-designed heuristics and metaheuristics, which constitutes an excellent outcome for bi-level optimization.
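As a toy illustration of heuristic generation (not the GA+AGH implementation above), the sketch below "trains" the item-scoring rule of a greedy knapsack heuristic across several training instances. A simple hill-climbing search stands in for GP tree evolution, and all names and parameters are hypothetical.

```python
import random

def greedy_value(rule, instance):
    """Run a greedy heuristic with scoring rule a*value - b*weight."""
    a, b = rule
    values, weights, capacity = instance
    order = sorted(range(len(values)),
                   key=lambda i: a * values[i] - b * weights[i], reverse=True)
    total_v = total_w = 0
    for i in order:
        if total_w + weights[i] <= capacity:
            total_w += weights[i]
            total_v += values[i]
    return total_v

def train_heuristic(instances, iters=200, seed=1):
    """Improve the scoring rule's mean performance over training instances."""
    rng = random.Random(seed)
    best = (1.0, 0.0)  # start from the plain value-greedy rule
    best_fit = sum(greedy_value(best, inst) for inst in instances)
    for _ in range(iters):
        cand = (best[0] + rng.gauss(0, 0.5), best[1] + rng.gauss(0, 0.5))
        fit = sum(greedy_value(cand, inst) for inst in instances)
        if fit > best_fit:  # keep only improving mutations
            best, best_fit = cand, fit
    return best

# Toy training set: random knapsack instances standing in for
# lower-level instances encountered during bi-level optimization.
gen = random.Random(0)
instances = [([gen.randint(1, 20) for _ in range(15)],
              [gen.randint(1, 10) for _ in range(15)], 40) for _ in range(5)]
rule = train_heuristic(instances)
print(rule, greedy_value(rule, instances[0]))
```

The point of the two-phase scheme is visible even here: the (cheap) trained rule can then be called many times inside an upper-level search without re-solving each lower-level instance exactly.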
GP hyper-heuristic for the travelling salesman problem
Duflo, Gabriel; Kieffer, Emmanuel; Danoy, Grégoire et al. Scientific Conference (January 2019).

Co-evolutionary Hybrid Bi-level Optimization
Kieffer, Emmanuel. Doctoral thesis (2019).
Multi-level optimization stems from the need to tackle complex problems involving multiple decision makers. Two-level optimization, referred to as "bi-level optimization", occurs when two decision makers each control only part of the decision variables but impact each other (e.g., objective value, feasibility). Bi-level problems are sequential by nature and can be represented as nested optimization problems, in which one problem (the "upper level") is constrained by another (the "lower level"). The nested structure is a real obstacle and can be highly time-consuming when the lower level is NP-hard. Consequently, classical nested optimization should be avoided. Some surrogate-based approaches have been proposed to approximate the lower-level objective value function (or variables) in order to reduce the number of times the lower level is globally optimized. Unfortunately, such a methodology is not applicable to large-scale and combinatorial bi-level problems. After a deep study of theoretical properties and a survey of existing applications that are bi-level by nature, problems which can benefit from a bi-level reformulation are investigated. A first contribution of this work is a novel bi-level clustering approach. Extending the well-known uncapacitated k-median problem, it is shown that clustering can easily be modeled as a two-level optimization problem using decomposition techniques. The resulting two-level problem is then turned into a bi-level problem, offering the possibility to combine distance metrics in a hierarchical manner. The novel bi-level clustering problem has a very interesting property that enables us to tackle it with classical nested approaches: its lower-level problem can be solved in polynomial time. In cooperation with the Luxembourg Centre for Systems Biomedicine (LCSB), this new clustering model has been applied to real datasets such as disease maps (e.g., Parkinson's, Alzheimer's). Using a novel hybrid and parallel genetic algorithm as the optimization approach, the results obtained after a campaign of experiments produce new knowledge compared to classical clustering techniques that combine distance metrics in the usual manner. The previous bi-level clustering model has the advantage that its lower level can be solved in polynomial time, although the global problem is by definition NP-hard. Therefore, further investigations were undertaken to tackle more general bi-level problems in which the lower-level problem does not present any specific advantageous properties. Since the lower-level problem can be very expensive to solve, the focus turned to surrogate-based approaches and hyper-parameter optimization techniques, with the aim of approximating the lower-level problem and reducing the number of global lower-level optimizations. By adapting the well-known Bayesian optimization algorithm to solve general bi-level problems, the number of expensive lower-level optimizations was dramatically reduced while very accurate solutions were obtained. The resulting solutions and the number of spared lower-level optimizations were compared to the results of the bi-level evolutionary algorithm based on quadratic approximations (BLEAQ) after a campaign of experiments on official bi-level benchmarks.
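The surrogate-based idea can be illustrated with a minimal one-dimensional Bayesian-optimization loop (a from-scratch sketch under simplifying assumptions, not BLEAQ or the thesis code): a Gaussian-process surrogate with an RBF kernel, queried through expected improvement, keeps the number of "expensive" objective evaluations small.

```python
import numpy as np
from math import erf

def rbf(A, B, ls=0.3):
    """Squared-exponential kernel between two 1-D point sets."""
    return np.exp(-0.5 * ((A[:, None] - B[None, :]) / ls) ** 2)

def gp_posterior(Xs, X, y, noise=1e-6):
    """GP posterior mean and variance at query points Xs."""
    K_inv = np.linalg.inv(rbf(X, X) + noise * np.eye(len(X)))
    Ks = rbf(Xs, X)
    mu = Ks @ K_inv @ y
    var = 1.0 - np.sum(Ks @ K_inv * Ks, axis=1)  # k(x,x) = 1 for RBF
    return mu, np.maximum(var, 1e-12)

def expected_improvement(mu, var, best):
    """EI for minimization: E[max(best - f, 0)] under the GP posterior."""
    sd = np.sqrt(var)
    z = (best - mu) / sd
    cdf = 0.5 * (1 + np.vectorize(erf)(z / np.sqrt(2)))
    pdf = np.exp(-0.5 * z ** 2) / np.sqrt(2 * np.pi)
    return (best - mu) * cdf + sd * pdf

def bayes_opt(f, iters=15, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.uniform(0, 1, 3)               # small initial design
    y = np.array([f(x) for x in X])
    grid = np.linspace(0, 1, 200)          # candidate acquisition points
    for _ in range(iters):
        mu, var = gp_posterior(grid, X, y)
        x_next = grid[np.argmax(expected_improvement(mu, var, y.min()))]
        X, y = np.append(X, x_next), np.append(y, f(x_next))
    return X[np.argmin(y)], y.min()

# Stand-in "expensive" objective (e.g., a full lower-level optimization),
# minimized at x = 0.7; only 3 + 15 evaluations are spent in total.
x_best, y_best = bayes_opt(lambda x: (x - 0.7) ** 2)
print(x_best, y_best)
```

In the bi-level setting the callable `f` would wrap a complete lower-level optimization, which is exactly the call the surrogate is meant to ration.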
Although both approaches are very accurate, the bi-level Bayesian version required fewer lower-level objective function calls. Surrogate-based approaches are restricted to small-scale, continuous bi-level problems, whereas many real applications are combinatorial by nature. As for continuous problems, a study was performed to apply machine learning strategies. Instead of approximating the lower-level solution value, new approximation algorithms were designed for the discrete/combinatorial case. Using the principle employed in GP hyper-heuristics, heuristics are trained to tackle efficiently the NP-hard lower level of bi-level problems. This automatic generation of heuristics makes it possible to break the nested structure into two separate phases: training lower-level heuristics, and solving the upper-level problem with the new heuristics. On this occasion, a second modeling contribution was introduced: a novel large-scale, mixed-integer bi-level problem dealing with pricing in the cloud, i.e., the Bi-level Cloud Pricing Optimization Problem (BCPOP). After a series of experiments that consisted in training heuristics on various lower-level instances of the BCPOP and using them to tackle the bi-level problem itself, the obtained results were compared to the cooperative co-evolutionary algorithm for bi-level optimization (COBRA). Although training heuristics makes it possible to break the nested structure, a two-phase optimization is still required. Therefore, the emphasis was put on training heuristics while optimizing the upper-level problem using competitive co-evolution. Instead of adopting the classical decomposition scheme used by COBRA, which suffers from the strong epistatic links between lower-level and upper-level variables, co-evolving the solution and the means to reach it can cope with these epistatic-link issues. The CARBON algorithm developed in this thesis is a competitive, hybrid co-evolutionary algorithm designed for this purpose. To validate the potential of CARBON, numerical experiments were designed and the results compared to state-of-the-art algorithms. These results demonstrate that CARBON makes it possible to address nested optimization efficiently.

Cloud Brokering with Bundles: Multi-objective Optimization of Services Selection
Kieffer, Emmanuel; Guzek, Mateusz et al. In Foundations of Computing and Decision Sciences (2019).

A Competitive Approach for Bi-Level Co-Evolution
Kieffer, Emmanuel; Danoy, Grégoire; Bouvry, Pascal et al. In 2018 IEEE International Parallel and Distributed Processing Symposium Workshops (IPDPSW) (May 2018).

Bayesian optimization to enhance coverage performance of a swarm of UAV with chaotic dynamics
Kieffer, Emmanuel; Rosalie, Martin; Danoy, Grégoire et al. Scientific Conference (February 2018).
We introduce the optimization of CACOC through Bayesian optimization. CACOC is based on a chaotic system, the Rössler system, whose behavior can be modified by tuning the α parameter. In order to evaluate the performance of CACOC for different values of α, the coverage metric has to be evaluated after simulation, which is time-consuming. Therefore, a surrogate-based optimization, namely Bayesian optimization, was preferred to tackle this issue. An analysis of the chaotic system with the obtained α value was performed to compare the periodic orbits and their associated patterns.
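For context, the Rössler system underlying CACOC can be simulated in a few lines. The sketch below uses a fixed-step RK4 integrator with the classic parameter values a = b = 0.2, c = 5.7; treating the `a` coefficient as the tunable parameter (written α above) is an assumption made for illustration.

```python
import numpy as np

def rossler_deriv(state, a=0.2, b=0.2, c=5.7):
    """Rössler vector field: dx = -y - z, dy = x + a*y, dz = b + z*(x - c)."""
    x, y, z = state
    return np.array([-y - z, x + a * y, b + z * (x - c)])

def integrate_rk4(state, dt=0.01, steps=20000, a=0.2):
    """Integrate with classical fixed-step RK4; returns a (steps+1, 3) trajectory."""
    traj = np.empty((steps + 1, 3))
    traj[0] = state
    for k in range(steps):
        s = traj[k]
        k1 = rossler_deriv(s, a)
        k2 = rossler_deriv(s + 0.5 * dt * k1, a)
        k3 = rossler_deriv(s + 0.5 * dt * k2, a)
        k4 = rossler_deriv(s + dt * k3, a)
        traj[k + 1] = s + (dt / 6) * (k1 + 2 * k2 + 2 * k3 + k4)
    return traj

traj = integrate_rk4(np.array([1.0, 1.0, 1.0]))
print(traj[-1])  # a point on (or near) the chaotic attractor
```

Sweeping `a` and inspecting the resulting orbits is the cheap analogue of the bifurcation-diagram analysis the abstract refers to; the expensive part in the paper is the coverage simulation driven by each trajectory, which motivates the surrogate.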
Numerical results show that the best α value avoids wasting time in periodic regions of the bifurcation diagram. Future work will focus on more complex chaotic systems as well as new application domains for the optimized CACOC approach.

Clustering approaches for visual knowledge exploration in molecular interaction networks
Ostaszewski, Marek; Kieffer, Emmanuel et al. In BMC Bioinformatics (2018), 19(1), 308.
BACKGROUND: Biomedical knowledge grows in complexity and becomes encoded in network-based repositories, which include focused, expert-drawn diagrams, networks of evidence-based associations, and established ontologies. Combining these structured information sources is an important computational challenge, as large graphs are difficult to analyze visually. RESULTS: We investigate knowledge discovery in manually curated and annotated molecular interaction diagrams. To evaluate similarity of content we use: (i) Euclidean distance in expert-drawn diagrams, (ii) shortest-path distance in the underlying network, and (iii) ontology-based distance. We employ clustering with these metrics used separately and in pairwise combinations. We propose a novel bi-level optimization approach, together with an evolutionary algorithm, for an informative combination of distance metrics. We compare the enrichment of the obtained clusters between the solutions and with expert knowledge, and calculate the number of Gene and Disease Ontology terms discovered by different solutions as a measure of cluster quality. Our results show that combining distance metrics can improve clustering accuracy, based on the comparison with expert-provided clusters. Also, the performance of specific combinations of distance functions depends on the clustering depth (number of clusters). By employing the bi-level optimization approach we evaluated the relative importance of the distance functions and found that the order in which they are combined indeed affects clustering performance. Next, with an enrichment analysis of the clustering results, we found that both hierarchical and bi-level clustering schemes discovered more Gene and Disease Ontology terms than expert-provided clusters for the same knowledge repository. Moreover, bi-level clustering found more enriched terms than the best hierarchical clustering solution for three distinct distance-metric combinations in three different instances of disease maps. CONCLUSIONS: In this work we examined the impact of different distance functions on clustering of a visual biomedical knowledge repository. We found that combining distance functions may be beneficial for clustering and may improve exploration of such repositories. We proposed bi-level optimization to evaluate the importance of the order in which the distance functions are combined. Both the combination and the order of these functions affected clustering quality and knowledge recognition in the considered benchmarks. We propose that multiple dimensions can be utilized simultaneously for visual knowledge exploration.
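A minimal sketch of the combined-distance-metrics idea (hypothetical, not the paper's pipeline): two min-max-normalized distance matrices, standing in for, e.g., Euclidean layout distance and network shortest-path distance, are blended with a weight, and the blend is clustered by single-linkage agglomeration implemented with union-find over sorted pairs.

```python
import numpy as np

def normalize(D):
    """Min-max rescale a distance matrix to [0, 1]."""
    return (D - D.min()) / (D.max() - D.min())

def combine(D1, D2, w=0.5):
    """Weighted blend of two normalized distance matrices."""
    return w * normalize(D1) + (1 - w) * normalize(D2)

def single_linkage(D, k):
    """Merge closest pairs (union-find) until k clusters remain; return labels."""
    n = len(D)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    pairs = sorted((D[i, j], i, j) for i in range(n) for j in range(i + 1, n))
    clusters = n
    for _, i, j in pairs:
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
            clusters -= 1
            if clusters == k:
                break
    roots = sorted({find(i) for i in range(n)})
    return [roots.index(find(i)) for i in range(n)]

# Toy data: two well-separated groups; the second "metric" is a stand-in.
pts = np.array([[0.0, 0], [0.1, 0], [0.2, 0], [5.0, 5], [5.1, 5], [5.2, 5]])
D1 = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
D2 = D1 + 0.01
labels = single_linkage(combine(D1, D2), k=2)
print(labels)
```

In the paper's setting the weight (and the order of combination) is what the bi-level/evolutionary layer optimizes; here `w` is simply fixed at 0.5.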
Visualizing the Template of a Chaotic Attractor
Olszewski, Maya Alexandra; Meder, Jeff Alphonse Antoine; Kieffer, Emmanuel et al. In 26th International Symposium on Graph Drawing and Network Visualization (GD 2018) (2018).

Bayesian Optimization Approach of General Bi-level Problems
Kieffer, Emmanuel; Danoy, Grégoire; Bouvry, Pascal et al. In Proceedings of the Genetic and Evolutionary Computation Conference Companion (2017).

A new modeling approach for the biobjective exact optimization of satellite payload configuration
Kieffer, Emmanuel; Danoy, Grégoire; Bouvry, Pascal et al. In International Transactions in Operational Research (2017).

A new Co-evolutionary Algorithm Based on Constraint Decomposition
Kieffer, Emmanuel; Danoy, Grégoire; Bouvry, Pascal et al. In IPDPS (2017).

Co-evolutionary approach based on constraint decomposition
Kieffer, Emmanuel; Danoy, Grégoire; Bouvry, Pascal et al. (October 2016).

On Bi-level approach for Scheduling problems
Kieffer, Emmanuel; Danoy, Grégoire; Bouvry, Pascal. Scientific Conference (April 2016).

A Novel Co-evolutionary Approach for Constrained Genetic Algorithms
Kieffer, Emmanuel; Guzek, Mateusz; Danoy, Grégoire et al. In Proceedings of the 2016 Genetic and Evolutionary Computation Conference Companion (2016).