; ; et al. In IEEE Robotics and Automation Magazine (2004), 11(4), 45-52.

; Vlassis, Nikos. In Neural Computation (2003), 15(2), 469-485.
This article concerns the greedy learning of Gaussian mixtures. In the greedy approach, mixture components are inserted into the mixture one after the other. We propose a heuristic for searching for the optimal component to insert: in a randomized manner, a set of candidate new components is generated; for each of these candidates, we find the locally optimal new component and insert it into the existing mixture. The resulting algorithm resolves the sensitivity to initialization of state-of-the-art methods such as expectation maximization, and has running time linear in the number of data points and quadratic in the (final) number of mixture components. Due to its greedy nature, the algorithm is particularly useful when the optimal number of mixture components is unknown. Experimental results comparing the proposed algorithm to other methods on density estimation and texture segmentation are provided.

; Vlassis, Nikos. In Proc. Europ. Symp. on Artificial Neural Networks (2002).

Vlassis, Nikos. In Neural Computation (2002), 14(1), 191-215.
High-dimensional data generated by a system with limited degrees of freedom are often constrained to low-dimensional manifolds in the original space.
In this article, we investigate dimension-reduction methods for such intrinsically low-dimensional data through linear projections that preserve the manifold structure of the data. For intrinsically one-dimensional data, this implies projecting to a curve on the plane with as few intersections as possible. We propose a supervised projection pursuit method that can be regarded as an extension of the single-index model for nonparametric regression. We show results from a toy problem and two robotic applications.

; Vlassis, Nikos. In Proceedings of the Tenth European Symposium on Artificial Neural Networks (2002).
We present a fast alternative to the Isomap algorithm. A set of quantizers is fitted to the data, and a neighborhood structure based on the competitive Hebbian rule is imposed on it. This structure is used to obtain a low-dimensional description of the data by computing geodesic distances and applying multidimensional scaling. The quantization allows for faster processing of the data. The speed-up compared to Isomap is roughly quadratic in the ratio between the number of quantizers and the number of data points.

Vlassis, Nikos. In IEEE International Conference on Robotics and Automation, Vols I-IV, Proceedings (2002).
We apply the auxiliary particle filter algorithm of Pitt and Shephard (1999) to the problem of robot localization.
To deal with the high-dimensional sensor observations (images) and an unknown observation model, we propose the use of an inverted nonparametric observation model computed by nearest-neighbor conditional density estimation. We show that the proposed model can lead to a fully adapted optimal filter, and that it successfully handles image occlusion and robot kidnapping. The proposed algorithm is very simple to implement and exhibits a high degree of robustness in practice. We report experiments involving robot localization from omnidirectional vision in an indoor environment.

; Vlassis, Nikos. In Pattern Recognition Letters (2002), 23(8), 1009-1017.
We propose an incremental method to find principal curves. Line segments are fitted and connected to form polygonal lines; new segments are inserted until a performance criterion is met. Experimental results illustrate the performance of the method compared to other existing approaches.

; Vlassis, Nikos. In Artificial Neural Networks - ICANN 2001, Proceedings (2001).
We propose a new method to find principal curves for data sets. The method repeats three steps until a stopping criterion is met. In the first step, k (unconnected) line segments are fitted to the data. The second step connects the segments to form a polygonal line and evaluates its quality. The third step inserts a new line segment.
We compare the performance of our new method with that of other existing methods for finding principal curves.

Vlassis, Nikos. In Proc. IEEE Int. Conf. on Robotics and Automation (2000).
We seek linear projections of supervised high-dimensional robot observations, together with an appropriate environment model, that optimize the robot localization task. We show that an appropriate risk function to minimize is the conditional entropy of the robot positions given the projected observations. We propose a method of iterative optimization through a probabilistic model based on kernel smoothing. To obtain good starting solutions for the optimization, we use canonical correlation analysis. We apply our method in a real experiment involving a mobile robot equipped with an omnidirectional camera in an office setup.

Vlassis, Nikos. In Proc. Int. Conf. on Intelligent Robots and Systems (1999).
A key issue in mobile robot applications is building a map of the environment to be used by the robot for localization and path planning. We propose a framework for robot map building based on principal component regression, a statistical method for extracting low-dimensional dependencies between a set of input and target values. A supervised set of robot positions (inputs) and associated high-dimensional sensor measurements (targets) is assumed.
A set of globally uncorrelated features of the original sensor measurements is obtained by applying principal component analysis to the target set. A parametrized model of the conditional density function of the sensor features given the robot positions is built from an unbiased estimation procedure that fits interpolants for both the mean and the variance of each feature independently. The simulation results show that the average Bayesian localization error is an increasing function of the principal component index.

; ; Vlassis, Nikos. In Proc. IJCAI'99, 16th Int. Joint Conf. on Artificial Intelligence, ROB-2 Workshop (1999).

Vlassis, Nikos. In Proc. 9th Int. Conf. on Artificial Neural Networks (1999).

Vlassis, Nikos. In Proc. ACAI'99, Int. Conf. on Machine Learning and Applications (1999).
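The greedy mixture-learning scheme summarized in the Neural Computation (2003) abstract can be sketched in a few lines. This is a simplified illustration, not the paper's exact algorithm: all names (`mixture_loglik`, `greedy_insert`, `em_step`) are hypothetical, the data are one-dimensional, and a few plain EM iterations stand in for the paper's local optimization of the candidate component.

```python
import numpy as np

def mixture_loglik(X, weights, means, variances):
    """Total log-likelihood of a 1-D Gaussian mixture."""
    d = X[:, None] - means[None, :]
    dens = weights * np.exp(-0.5 * d**2 / variances) / np.sqrt(2 * np.pi * variances)
    return float(np.sum(np.log(dens.sum(axis=1) + 1e-300)))

def em_step(X, weights, means, variances):
    """One EM iteration, used to locally optimize the mixture after an insertion."""
    d = X[:, None] - means[None, :]
    dens = weights * np.exp(-0.5 * d**2 / variances) / np.sqrt(2 * np.pi * variances)
    resp = dens / dens.sum(axis=1, keepdims=True)     # responsibilities (E-step)
    nk = resp.sum(axis=0)
    weights = nk / len(X)                             # M-step updates
    means = (resp * X[:, None]).sum(axis=0) / nk
    variances = (resp * (X[:, None] - means) ** 2).sum(axis=0) / nk + 1e-6
    return weights, means, variances

def greedy_insert(X, weights, means, variances, n_candidates=20, rng=None):
    """Generate random candidate components and insert the best-scoring one."""
    rng = rng or np.random.default_rng(0)
    k, best = len(means), None
    for _ in range(n_candidates):
        mu = rng.choice(X)                 # candidate mean: a random data point
        var = np.var(X) / (k + 1)          # heuristic candidate variance
        a = 1.0 / (k + 1)                  # mixing weight given to the new component
        w = np.append(weights * (1 - a), a)
        m, v = np.append(means, mu), np.append(variances, var)
        ll = mixture_loglik(X, w, m, v)
        if best is None or ll > best[0]:
            best = (ll, w, m, v)
    return best[1], best[2], best[3]

# Grow a mixture greedily on two well-separated 1-D clusters.
rng = np.random.default_rng(1)
X = np.concatenate([rng.normal(-5.0, 1.0, 200), rng.normal(5.0, 1.0, 200)])
weights, means, variances = np.array([1.0]), np.array([X.mean()]), np.array([X.var()])
ll0 = mixture_loglik(X, weights, means, variances)
weights, means, variances = greedy_insert(X, weights, means, variances, rng=rng)
for _ in range(50):                        # local optimization after the insertion
    weights, means, variances = em_step(X, weights, means, variances)
ll_final = mixture_loglik(X, weights, means, variances)
```

Because each insertion starts from the current mixture rather than a fresh random initialization, the procedure can simply stop at whichever component count satisfies a model-selection criterion, which is the practical appeal noted in the abstract.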
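The fast Isomap alternative described in the ESANN 2002 abstract can likewise be approximated in outline. This is a minimal sketch under stated assumptions: a naive subsample stands in for properly fitted quantizers, each data point connects its two nearest quantizers (a competitive-Hebbian-style rule), and Floyd-Warshall plus classical MDS recover the embedding; all function names are hypothetical.

```python
import numpy as np

def competitive_hebbian_graph(X, Q):
    """Connect, for every data point, its two nearest quantizers."""
    n_q = len(Q)
    A = np.full((n_q, n_q), np.inf)      # inf = no edge
    np.fill_diagonal(A, 0.0)
    for x in X:
        i, j = np.argsort(np.linalg.norm(Q - x, axis=1))[:2]
        A[i, j] = A[j, i] = np.linalg.norm(Q[i] - Q[j])
    return A

def geodesics(A):
    """All-pairs shortest paths (Floyd-Warshall) over the quantizer graph."""
    D = A.copy()
    for k in range(len(D)):
        D = np.minimum(D, D[:, [k]] + D[[k], :])
    return D

def classical_mds(D, dim=2):
    """Embed a distance matrix via double centering and the top eigenvectors."""
    n = len(D)
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (D ** 2) @ J
    vals, vecs = np.linalg.eigh(B)
    idx = np.argsort(vals)[::-1][:dim]
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0.0))

# Data on a curved 1-D manifold in the plane; quantizers = a crude subsample.
t = np.linspace(0.0, 3.0 * np.pi, 300)
X = np.column_stack([t, np.sin(t)])
Q = X[::10]                              # 30 "quantizers" (no fitting, for brevity)
D = geodesics(competitive_hebbian_graph(X, Q))
Y = classical_mds(D, dim=2)              # low-dimensional embedding of the quantizers
```

The cost saving comes from running the shortest-path and MDS stages on the quantizers only rather than on every data point, which is the source of the speed-up the abstract reports.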