ORBi

- Anytime Algorithms for Multiagent Decision Making Using Coordination Graphs. Vlassis, Nikos, et al. In Proceedings of the International Conference on Systems, Man and Cybernetics (2004).
  Coordination graphs provide a tractable framework for cooperative multiagent decision making by decomposing the global payoff function into a sum of local terms. In this paper we review some distributed algorithms for action selection in a coordination graph and discuss their pros and cons. For real-time decision making we emphasize the need for anytime algorithms for action selection: these are algorithms that improve the quality of the solution over time. We describe variable elimination, coordinate ascent, and the max-plus algorithm, the latter being an instance of the belief propagation algorithm in Bayesian networks. We discuss some interesting open problems related to the use of the max-plus algorithm in real-time multiagent decision making.

- Skin detection using the EM algorithm with spatial constraints. Vlassis, Nikos, et al. In Proc. Int. Conf. on Systems, Man and Cybernetics (2004).
  In this paper, we propose a color-based method for skin detection and segmentation, which also takes into account the spatial coherence of the skin pixels. We treat the problem of skin detection as an inference problem. We assume that each pixel in an image has a hidden binary label associated with it that specifies whether it is skin or not. In order to solve the inference problem, we use a variational EM algorithm, which incorporates the spatial constraints with only a small computational overhead in the E-step. Finally, we show that our method provides better results than the standard EM algorithm and a state-of-the-art skin-detection method from the literature (Jones, M. J. and Rehg, J. M., 2002).

- Household robots look and learn. Vlassis, Nikos, et al. In IEEE Robotics & Automation Magazine (2004), 11(4), 45-52.

- A point-based POMDP algorithm for robot planning. Vlassis, Nikos, et al. In Proc. IEEE Int. Conf. on Robotics and Automation, New Orleans, Louisiana (2004).
  We present an approximate POMDP solution method for robot planning in partially observable environments. Our algorithm belongs to the family of point-based value iteration solution techniques for POMDPs, in which planning is performed only on a sampled set of reachable belief points. We describe a simple, randomized procedure that performs value update steps that strictly improve the value of all belief points in each step. We demonstrate our algorithm on a robotic delivery task in an office environment and on several benchmark problems, for which we compute solutions that are very competitive with those of state-of-the-art methods in terms of speed and solution quality.

- Self-Organization by Optimizing Free-Energy. Vlassis, Nikos, et al. In Proc. of the European Symposium on Artificial Neural Networks (2003).
  We present a variational Expectation-Maximization algorithm to learn probabilistic mixture models. The algorithm is similar to Kohonen's Self-Organizing Map algorithm and is not limited to Gaussian mixtures. We maximize the variational free energy, which sums the data log-likelihood and the Kullback-Leibler divergence between a normalized neighborhood function and the posterior distribution on the components, given the data. We illustrate the algorithm with an application to word clustering.

- Efficient Greedy Learning of Gaussian Mixture Models. Vlassis, Nikos, et al. In Neural Computation (2003), 15(2), 469-485.
  This article concerns the greedy learning of Gaussian mixtures. In the greedy approach, mixture components are inserted into the mixture one after the other. We propose a heuristic for searching for the optimal component to insert. In a randomized manner, a set of candidate new components is generated. For each of these candidates, we find the locally optimal new component and insert it into the existing mixture. The resulting algorithm resolves the sensitivity to initialization of state-of-the-art methods, like expectation maximization, and has running time linear in the number of data points and quadratic in the (final) number of mixture components. Due to its greedy nature, the algorithm can be particularly useful when the optimal number of mixture components is unknown. Experimental results comparing the proposed algorithm to other methods on density estimation and texture segmentation are provided.

- The global k-means clustering algorithm. Vlassis, Nikos, et al. In Pattern Recognition (2003), 36(2), 451-461.
  We present the global k-means algorithm, an incremental approach to clustering that dynamically adds one cluster center at a time through a deterministic global search procedure consisting of N (with N being the size of the data set) executions of the k-means algorithm from suitable initial positions. We also propose modifications of the method to reduce the computational load without significantly affecting solution quality. The proposed clustering methods are tested on well-known data sets and compare favorably to the k-means algorithm with random restarts.

- Multi-Robot Decision Making Using Coordination Graphs. Vlassis, Nikos, et al. In Proceedings of the International Conference on Advanced Robotics (ICAR) (2003).
  Within a group of cooperating agents, the decision making of an individual agent depends on the actions of the other agents. In dynamic environments, these dependencies will change rapidly as a result of the continuously changing state. Via a context-specific decomposition of the problem into smaller subproblems, coordination graphs offer scalable solutions to the problem of multiagent decision making. We apply coordination graphs to the continuous domain by assigning roles to the agents and then coordinating the different roles. Finally, we demonstrate this method in the RoboCup soccer simulation domain.

- Fast Nonlinear Dimensionality Reduction With Topology Preserving Networks. Vlassis, Nikos, et al. In Proceedings of the Tenth European Symposium on Artificial Neural Networks (2002).
  We present a fast alternative to the Isomap algorithm. A set of quantizers is fit to the data, and a neighborhood structure based on the competitive Hebbian rule is imposed on it. This structure is used to obtain a low-dimensional description of the data by computing geodesic distances and multidimensional scaling. The quantization allows for faster processing of the data. The speed-up compared to Isomap is roughly quadratic in the ratio between the number of quantizers and the number of data points.

- A k-segments algorithm for finding principal curves. Vlassis, Nikos, et al. In Pattern Recognition Letters (2002), 23(8), 1009-1017.
  We propose an incremental method to find principal curves. Line segments are fitted and connected to form polygonal lines. New segments are inserted until a performance criterion is met. Experimental results illustrate the performance of the method compared to other existing approaches.

- Fast nonlinear dimensionality reduction with topology representing networks. Vlassis, Nikos, et al. In Proc. Europ. Symp. on Artificial Neural Networks (2002).

- Supervised dimension reduction of intrinsically low-dimensional data. Vlassis, Nikos, et al. In Neural Computation (2002), 14(1), 191-215.
  High-dimensional data generated by a system with limited degrees of freedom are often constrained to low-dimensional manifolds in the original space. In this article, we investigate dimension-reduction methods for such intrinsically low-dimensional data through linear projections that preserve the manifold structure of the data. For intrinsically one-dimensional data, this implies projecting to a curve on the plane with as few intersections as possible. We propose a supervised projection pursuit method that can be regarded as an extension of the single-index model for nonparametric regression. We show results from a toy problem and two robotic applications.

- Coordinating Principal Component Analyzers. Vlassis, Nikos, et al. In Proc. Int. Conf. on Artificial Neural Networks, Madrid, Spain (2002).
  Mixtures of Principal Component Analyzers can be used to model high-dimensional data that lie on or near a low-dimensional manifold. By linearly mapping the PCA subspaces to one global low-dimensional space, we obtain a "global" low-dimensional coordinate system for the data. As shown by Roweis et al., ensuring consistent global low-dimensional coordinates for the data can be expressed as a penalized likelihood optimization problem. We show that a restricted form of the Mixtures of Probabilistic PCA model allows for a more efficient algorithm. Experimental results are provided to illustrate the viability of the method.

- A greedy EM algorithm for Gaussian mixture learning. Vlassis, Nikos, et al. In Neural Processing Letters (2002), 15(1), 77-87.
  Learning a Gaussian mixture with a local algorithm like EM can be difficult because (i) the true number of mixing components is usually unknown, (ii) there is no generally accepted method for parameter initialization, and (iii) the algorithm can get trapped in one of the many local maxima of the likelihood function. In this paper we propose a greedy algorithm for learning a Gaussian mixture which tries to overcome these limitations. In particular, starting with a single component and adding components sequentially up to a maximum number k, the algorithm is capable of achieving solutions superior to EM with k components in terms of the likelihood of a test set. The algorithm is based on recent theoretical results on incremental mixture density estimation, and uses a combination of global and local search each time a new component is added to the mixture.

- Auxiliary particle filter robot localization from high-dimensional sensor observations. Vlassis, Nikos, et al. In IEEE International Conference on Robotics and Automation, Vols I-IV, Proceedings (2002).
  We apply the auxiliary particle filter algorithm of Pitt and Shephard (1999) to the problem of robot localization. To deal with the high-dimensional sensor observations (images) and an unknown observation model, we propose the use of an inverted nonparametric observation model computed by nearest neighbor conditional density estimation. We show that the proposed model can lead to a fully adapted optimal filter and is able to successfully handle image occlusion and robot kidnapping. The proposed algorithm is very simple to implement and exhibits a high degree of robustness in practice. We report experiments involving robot localization from omnidirectional vision in an indoor environment.

- A soft k-segments algorithm for principal curves. Vlassis, Nikos, et al. In Artificial Neural Networks - ICANN 2001, Proceedings (2001).
  We propose a new method to find principal curves for data sets. The method repeats three steps until a stopping criterion is met. In the first step, k (unconnected) line segments are fitted to the data. The second step connects the segments to form a polygonal line and evaluates the quality of the resulting polygonal line. The third step inserts a new line segment. We compare the performance of our new method with that of other existing methods for finding principal curves.

- Fast score function estimation with application in ICA. Vlassis, Nikos. In Proc. Int. Conf. on Artificial Neural Networks (2001).

- Jijo-2: An office robot that communicates and learns. Vlassis, Nikos, et al. In IEEE Intelligent Systems (2001), 16(5), 46-55.

- Learning Task-Relevant Features From Robot Data. Vlassis, Nikos, et al. In IEEE International Conference on Robotics and Automation, Proceedings (ICRA 2001).

- A probabilistic model for appearance-based robot localization. Vlassis, Nikos, et al. In Image & Vision Computing (2001), 19(6), 381-391.
  In this paper we present a method for appearance-based modeling of the environment of a mobile robot. We describe the task (localization of the robot) in a probabilistic framework. Linear image features are extracted using a Principal Component Analysis. The appearance model is represented as a probability density function of the image feature vector given the location of the robot. We estimate this density model from the data with a kernel estimation method. We show how the parameters of the model influence the localization performance. We also study how many features, and which features, are needed for good localization.
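The coordination-graphs entries above rely on variable elimination for action selection: each agent is eliminated in turn by maximizing it out of the local payoff tables, and back-substitution recovers the maximizing joint action. As an illustration only (this is not code from any of the listed papers), here is a minimal pure-Python sketch for a three-agent chain with global payoff f12(a1, a2) + f23(a2, a3); the payoff tables are made-up toy values:

```python
import itertools

# Toy local payoff tables for a 3-agent chain graph: u(a) = f12(a1,a2) + f23(a2,a3).
# Each agent has two actions {0, 1}; the numbers are illustrative only.
f12 = {(0, 0): 1, (0, 1): 0, (1, 0): 0, (1, 1): 3}
f23 = {(0, 0): 2, (0, 1): 0, (1, 0): 0, (1, 1): 2}

def brute_force():
    """Exhaustive maximization over all 2^3 joint actions (exponential in general)."""
    return max(itertools.product((0, 1), repeat=3),
               key=lambda a: f12[a[0], a[1]] + f23[a[1], a[2]])

def variable_elimination():
    """Eliminate agents 1 and 3, let agent 2 decide, then back-substitute."""
    # Eliminating agent 1 induces a payoff on a2: e1[a2] = (max over a1, argmax a1).
    e1 = {a2: max((f12[a1, a2], a1) for a1 in (0, 1)) for a2 in (0, 1)}
    # Eliminating agent 3 likewise: e3[a2] = (max over a3, argmax a3).
    e3 = {a2: max((f23[a2, a3], a3) for a3 in (0, 1)) for a2 in (0, 1)}
    # Agent 2 maximizes the sum of the induced local payoffs.
    a2 = max((0, 1), key=lambda a: e1[a][0] + e3[a][0])
    # Back-substitution recovers the best responses of agents 1 and 3.
    return (e1[a2][1], a2, e3[a2][1])
```

On sparse graphs, elimination touches each local table only once per eliminated agent, so it scales far better than enumerating all joint actions while returning the same exact maximizer (unlike max-plus, which on cyclic graphs is approximate).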
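The global k-means abstract in the listing above is concrete enough to sketch: one center is added at a time, each data point is tried in turn as the seed for the new center, and the best k-means refinement is kept, making the procedure deterministic. A minimal pure-Python illustration (the helper names and toy data are mine, not the paper's code):

```python
def dist2(p, q):
    """Squared Euclidean distance between two points."""
    return sum((a - b) ** 2 for a, b in zip(p, q))

def mean(pts):
    """Component-wise mean of a non-empty list of points."""
    return tuple(sum(c) / len(pts) for c in zip(*pts))

def kmeans(points, centers, iters=100):
    """Plain Lloyd iterations from the given initial centers."""
    for _ in range(iters):
        clusters = [[] for _ in centers]
        for p in points:
            nearest = min(range(len(centers)), key=lambda j: dist2(p, centers[j]))
            clusters[nearest].append(p)
        # Keep a center unchanged if its cluster went empty.
        centers = [mean(cl) if cl else centers[j] for j, cl in enumerate(clusters)]
    err = sum(min(dist2(p, c) for c in centers) for p in points)
    return centers, err

def global_kmeans(points, K):
    """Global k-means: add one center at a time; try every data point as the
    seed for the new center and keep the best refined solution."""
    centers = [mean(points)]              # the optimal k = 1 solution
    for _ in range(2, K + 1):
        best_centers, best_err = None, float("inf")
        for p in points:                  # N candidate seeds for the new center
            cand, err = kmeans(points, centers + [tuple(p)])
            if err < best_err:
                best_centers, best_err = cand, err
        centers = best_centers
    return centers
```

The cost is N full k-means runs per added center, which is exactly the computational load the paper's proposed modifications aim to reduce.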