Vlassis, Nikos
in IEEE Transactions on Neural Networks (2001), 12(3), 559-566

A basic element in most independent component analysis (ICA) algorithms is the choice of a model for the score functions of the unknown sources. While this choice is usually based on approximations, for large data sets it is possible to achieve "source adaptivity" by directly estimating the "true" score functions of the sources from the data. In this paper we describe an efficient scheme for achieving this by extending the fast density estimation method of Silverman (1982). We show with a real and a synthetic experiment that our method can provide more accurate solutions than state-of-the-art methods when optimization is carried out in the vicinity of the global minimum of the contrast function.

Vlassis, Nikos
in Proc. IEEE Int. Conf. on Robotics and Automation (2001)

Vlassis, Nikos
in Proc. IEEE Int. Conf. on Robotics and Automation (2000)

We seek linear projections of supervised high-dimensional robot observations, together with an appropriate environment model, that optimize the robot localization task. We show that an appropriate risk function to minimize is the conditional entropy of the robot positions given the projected observations. We propose a method of iterative optimization through a probabilistic model based on kernel smoothing. To obtain good starting solutions for the optimization we use canonical correlation analysis. We apply our method in a real experiment involving a mobile robot equipped with an omnidirectional camera in an office setup.

Vlassis, Nikos
in Proc. 9th Int. Conf. on Artificial Neural Networks (1999)

Vlassis, Nikos
in Neural Processing Letters (1999), 9(1), 63-76

; ; Vlassis, Nikos
in Proc. IJCAI'99, 16th Int. Joint Conf. on Artificial Intelligence, ROB-2 Workshop (1999)

Vlassis, Nikos
in Proc. Int. Conf. on Intelligent Robots and Systems (1999)

A key issue in mobile robot applications is building a map of the environment to be used by the robot for localization and path planning. We propose a framework for robot map building based on principal component regression, a statistical method for extracting low-dimensional dependencies between a set of input and target values. A supervised set of robot positions (inputs) and associated high-dimensional sensor measurements (targets) is assumed. A set of globally uncorrelated features of the original sensor measurements is obtained by applying principal component analysis to the target set. A parametrized model of the conditional density function of the sensor features given the robot positions is then built, based on an unbiased estimation procedure that fits interpolants for both the mean and the variance of each feature independently. The simulation results show that the average Bayesian localization error is an increasing function of the principal component index.
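The map-building pipeline described in the abstract above (PCA features of the sensor targets, per-feature mean and variance interpolants, Bayesian localization) can be sketched on synthetic data. Everything below is illustrative, not the authors' implementation: the 1-D "corridor", the Gaussian-bump sensor model, and the Nadaraya-Watson smoother standing in for the paper's interpolants are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic supervised set: robot positions (inputs) and high-dimensional
# sensor measurements (targets). The Gaussian-bump sensor model along a
# 1-D corridor is only an illustrative stand-in for real measurements.
n, d = 300, 50
centers = np.linspace(0.0, 10.0, d)

def sense(p):
    """Noise-free sensor response at position(s) p, shape (..., d)."""
    return np.exp(-0.5 * (np.asarray(p)[..., None] - centers) ** 2)

positions = rng.uniform(0.0, 10.0, n)
targets = sense(positions) + 0.02 * rng.standard_normal((n, d))

# PCA on the target set: globally uncorrelated sensor features.
mu = targets.mean(axis=0)
_, _, Vt = np.linalg.svd(targets - mu, full_matrices=False)
n_pc = 5
feats = (targets - mu) @ Vt[:n_pc].T            # n x n_pc feature matrix

# Interpolant for the mean of each feature given the position, here via
# Nadaraya-Watson kernel smoothing (one smoother per feature).
h = 0.3                                         # kernel bandwidth

def predict_features(p_grid):
    w = np.exp(-0.5 * ((p_grid[:, None] - positions) / h) ** 2)
    return (w @ feats) / w.sum(axis=1, keepdims=True)

# Residual variance per feature, for a Gaussian likelihood model.
var = ((feats - predict_features(positions)) ** 2).mean(axis=0)

# Maximum-likelihood localization of a new measurement over a grid.
def localize(sensor_row):
    f = (sensor_row - mu) @ Vt[:n_pc].T
    grid = np.linspace(0.0, 10.0, 1001)
    loglik = -0.5 * (((f - predict_features(grid)) ** 2) / var).sum(axis=1)
    return grid[np.argmax(loglik)]

print(round(localize(sense(4.2)), 2))           # should land near 4.2
```

Using more principal components tightens the likelihood but, per the abstract's finding, the higher-index components carry progressively noisier position information.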
Vlassis, Nikos
in IEEE Transactions on Systems, Man and Cybernetics, Part A: Systems and Humans (1999), 29(4), 393-399

We address the problem of probability density function estimation using a Gaussian mixture model updated with the expectation-maximization (EM) algorithm. To deal with the case of an unknown number of mixing kernels, we define a new measure for Gaussian mixtures, called total kurtosis, which is based on the weighted sample kurtoses of the kernels. This measure provides an indication of how well the Gaussian mixture fits the data. We then propose a new dynamic algorithm for Gaussian mixture density estimation which monitors the total kurtosis at each step of the EM algorithm in order to decide dynamically on the correct number of kernels and possibly escape from local maxima. We show the potential of our technique in approximating unknown densities through a series of examples with several density estimation problems.

Vlassis, Nikos
in Proc. ACAI'99, Int. Conf. on Machine Learning and Applications (1999)

Vlassis, Nikos
in Proc. IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (1998)

Vlassis, Nikos
in Proc. IEEE Int. Conf. on Robotics and Automation (1998)

Vlassis, Nikos
in Proc. Int. Conf. on Artificial Neural Networks (1997)

; Vlassis, Nikos
in Journal of Intelligent & Robotic Systems (1996), 16(2), 169-184

Vlassis, Nikos
in Proc. 8th IEEE Int. Conf. on Tools with AI (1996)
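The total-kurtosis idea from the 1999 IEEE Transactions on Systems, Man and Cybernetics abstract above can be sketched as follows: run plain EM for a fixed number of kernels, then compute a responsibility-weighted sample kurtosis per kernel and sum the weighted excesses. The exact weighted-kurtosis formula below is a plausible reconstruction, not taken from the paper, and the 1-D data and EM details are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic 1-D data from two well-separated Gaussians.
x = np.concatenate([rng.normal(-3.0, 1.0, 500), rng.normal(3.0, 1.0, 500)])

def em_gmm(x, k, n_iter=200):
    """Plain EM for a 1-D Gaussian mixture with k kernels."""
    w = np.full(k, 1.0 / k)
    mu = np.quantile(x, np.linspace(0.1, 0.9, k))
    var = np.full(k, x.var())
    for _ in range(n_iter):
        # E-step: responsibilities r[i, j] of kernel j for point i.
        logp = (-0.5 * (x[:, None] - mu) ** 2 / var
                - 0.5 * np.log(2 * np.pi * var) + np.log(w))
        logp -= logp.max(axis=1, keepdims=True)
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: reestimate weights, means, variances.
        nj = r.sum(axis=0)
        w = nj / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nj
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nj
    return w, mu, var, r

def total_kurtosis(x, w, mu, var, r):
    """Weighted sum of the kernels' sample excess kurtoses (one plausible
    form of the measure). For a well-fitting mixture each kernel's sample
    kurtosis is near the Gaussian value, so the total stays near zero; a
    total far from zero signals a misfit, e.g. too few kernels."""
    nj = r.sum(axis=0)
    z4 = (r * ((x[:, None] - mu) ** 2 / var) ** 2).sum(axis=0) / nj
    return float((w * (z4 - 3.0)).sum())

# Expect a clearly nonzero total for k=1 (one Gaussian forced onto
# bimodal data) and a near-zero total for k=2.
for k in (1, 2):
    w, mu, var, r = em_gmm(x, k)
    print(k, round(total_kurtosis(x, w, mu, var, r), 2))
```

The dynamic algorithm of the paper would monitor this quantity during EM and add or remove kernels accordingly; the sketch only shows the fit indicator itself.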