Abstract :
[en] In this paper we introduce a real-time trinocular system to control rotary wing Unmanned Aerial Vehicles based on the 3D information extracted by cameras located on the ground. The algorithm tracks key features onboard the UAV to estimate the vehicle's position and orientation. The algorithm is validated against onboard sensors and known 3D positions, showing that the proposed camera configuration robustly estimates the helicopter's position with an adequate resolution, improving the position estimation and, in particular, the height estimation. The results show that the proposed algorithm is suitable to complement or replace GPS-based position estimation in situations where GPS information is unavailable or inaccurate. Using the extracted 3D information as visual feedback to the flight controller, the vehicle can perform tasks at low heights, such as autonomous landing, take-off, and positioning.
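The core of such a ground-based multi-camera system is recovering the 3D position of features tracked on the helicopter from their pixel projections in several calibrated views. The sketch below is not the authors' implementation; it illustrates standard linear (DLT) triangulation under the assumption that calibrated 3x4 projection matrices and matched feature detections are available from the three ground cameras.

```python
# Minimal sketch of multi-view triangulation for one tracked onboard feature.
# Assumes calibrated projection matrices P_i (world -> image) and the pixel
# coordinates of the same feature in each view; names here are illustrative.
import numpy as np

def triangulate(proj_mats, pixels):
    """Triangulate one 3D point from two or more calibrated views.

    proj_mats : list of 3x4 camera projection matrices.
    pixels    : list of (u, v) pixel coordinates of the same feature.
    Returns the estimated 3D point in world coordinates.
    """
    rows = []
    for P, (u, v) in zip(proj_mats, pixels):
        # Each view contributes two linear constraints on the homogeneous point X.
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    A = np.vstack(rows)
    # Least-squares solution: right singular vector of A with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

# Hypothetical usage with a trinocular ground setup:
# point_3d = triangulate([P1, P2, P3], [(u1, v1), (u2, v2), (u3, v3)])
```

Triangulating several known onboard features in this way yields a set of 3D points from which the helicopter's position and orientation can then be estimated and fed back to the flight controller.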
Martinez, Carol; Universidad Politécnica de Madrid, Computer Vision Group, ETSII, E-28006 Madrid, Spain.
Campoy, Pascual; Universidad Politécnica de Madrid, Computer Vision Group, ETSII, E-28006 Madrid, Spain.
Mondragon, Ivan; Universidad Politécnica de Madrid, Computer Vision Group, ETSII, E-28006 Madrid, Spain.
Event organizer :
IEEE Robotics and Automation Society; Robotics Society of Japan; Society of Instrument and Control Engineers; IEEE Industrial Electronics Society; Institute of Control, Robotics and Systems (Korea); ABB; Barrett Technology, Inc.; Willow Garage; ROBOTIS; Aldebaran Robotics
Scopus® citations (without self-citations): 37