Bavle, Hriday et al. In Journal of Intelligent and Robotic Systems (2019), 95(2), 601–627.

Search and Rescue (SAR) missions represent an important challenge in the robotics research field, as they usually involve highly variable scenarios that require a high level of autonomy and versatile decision-making capabilities. This challenge becomes even more relevant in the case of aerial robotic platforms, owing to their limited payload and computational capabilities. In this paper, we present a fully autonomous aerial robotic solution for executing complex SAR missions in unstructured indoor environments. The proposed system is based on the combination of a complete hardware configuration and a flexible system architecture which allows the execution of high-level missions in a fully unsupervised manner (i.e., without human intervention). In order to obtain flexible and versatile behaviors from the proposed aerial robot, several learning-based capabilities have been integrated for target recognition and interaction. The target recognition capability includes a supervised learning classifier based on a computationally efficient Convolutional Neural Network (CNN) model trained for target/background classification, while the capability to interact with the target for rescue operations introduces a novel Image-Based Visual Servoing (IBVS) algorithm which integrates a recent deep reinforcement learning method named Deep Deterministic Policy Gradients (DDPG). In order to train the aerial robot for performing IBVS tasks, a reinforcement learning framework has been developed which integrates a deep reinforcement learning agent (e.g., DDPG) with a Gazebo-based simulator for aerial robotics. The proposed system has been validated in a wide range of simulation flights, using Gazebo and PX4 Software-In-The-Loop, and in real flights in cluttered indoor environments, demonstrating the versatility of the proposed system in complex SAR missions.
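To make the IBVS control loop described above more concrete, here is a minimal sketch: a small deterministic policy, standing in for a trained DDPG actor, maps the normalized image error between the detected target and the image center to a bounded velocity command. `detect_target` is a hypothetical stub (not the authors' API), and the network weights are random placeholders rather than a trained model.

```python
# Minimal sketch of a DDPG-style IBVS step (illustrative only).
# `detect_target` is a hypothetical stub standing in for the paper's
# CNN-based target recognition; the tiny MLP below stands in for a
# trained DDPG actor (here randomly initialized, i.e. untrained).
import numpy as np

def detect_target(image):
    """Hypothetical stub: return the (u, v) pixel centroid of the
    target, or None if it is not visible in the image."""
    raise NotImplementedError

class ActorPolicy:
    """Deterministic actor: 2-D normalized image error -> 4-D action
    (vx, vy, vz, yaw_rate), each component bounded in [-1, 1]."""
    def __init__(self, state_dim=2, action_dim=4, hidden=32, seed=0):
        rng = np.random.default_rng(seed)
        self.w1 = rng.normal(0.0, 0.1, (state_dim, hidden))
        self.w2 = rng.normal(0.0, 0.1, (hidden, action_dim))

    def act(self, state):
        return np.tanh(np.tanh(state @ self.w1) @ self.w2)

def ibvs_step(policy, image, width, height, v_max=0.5):
    """One servoing step: compute a velocity command from the image."""
    target = detect_target(image)
    if target is None:
        return (0.0, 0.0, 0.0, 0.0)   # hover while the target is lost
    u, v = target
    # Normalized image error, zero when the target is centered.
    err = np.array([(u - width / 2) / (width / 2),
                    (v - height / 2) / (height / 2)])
    return tuple(v_max * policy.act(err))  # scaled velocity command
```

In the paper the actor is trained in a Gazebo/PX4 simulation loop; this sketch only shows the inference-time structure of such a controller.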
Bavle, Hriday et al. In International Journal of Micro Air Vehicles (2018), 10(4), 352–361.

The lack of redundant attitude sensors represents a considerable yet common vulnerability in many low-cost unmanned aerial vehicles. In addition to the use of attitude sensors, exploiting the horizon as a visual reference for attitude control is part of human pilots' training. For this reason, and given the desirable properties of image sensors, considerable research has proposed the use of vision sensors for horizon detection in order to obtain redundant attitude estimation onboard unmanned aerial vehicles. However, atmospheric and illumination conditions may hinder the operability of visible-light image sensors, or even make their use impractical, such as during the night. Thermal infrared image sensors have a much wider range of operating conditions, and their price has decreased greatly in recent years, making them an alternative to visible-spectrum sensors in certain operation scenarios. In this paper, two attitude estimation methods are proposed. The first method consists of a novel approach to estimate the line that best fits the horizon in a thermal image. The resulting line is then used to estimate the pitch and roll angles using an infinite horizon line model. The second method uses deep learning to predict attitude angles from the raw pixel intensities of a thermal image. For this, a novel Convolutional Neural Network architecture has been trained using measurements from an inertial navigation system. Both methods are shown to be valid for redundant attitude estimation, providing RMS errors below 1.7° and running at up to 48 Hz, depending on the chosen method, the input image resolution, and the available computational capabilities.
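As a worked example of the first (horizon-line) method, the sketch below recovers roll and pitch from a fitted horizon line under an infinite-horizon pinhole model. The sign conventions and camera intrinsics are assumptions, not the authors' implementation, and the line-fitting step on the thermal image is abstracted away.

```python
# Hedged sketch: roll/pitch from a horizon line v = slope*u + intercept
# (pixel coordinates, v pointing down) under an infinite-horizon pinhole
# model. Sign conventions are assumptions; a slope/intercept line
# parameterization also degenerates near 90 degrees of roll.
import math

def attitude_from_horizon(slope, intercept, fy, cx, cy):
    """Estimate (roll, pitch) in radians from the fitted horizon line,
    given the vertical focal length fy and principal point (cx, cy)."""
    # Roll: the camera is banked by the angle of the horizon line.
    roll = -math.atan(slope)
    # Pitch: vertical offset of the horizon at the principal point,
    # mapped through the focal length (the infinite horizon projects
    # along the direction of zero elevation).
    v_at_center = slope * cx + intercept
    pitch = math.atan2(v_at_center - cy, fy)
    return roll, pitch

# Example with assumed intrinsics for a 640x512 thermal camera:
roll, pitch = attitude_from_horizon(slope=0.05, intercept=280.0,
                                    fy=500.0, cx=320.0, cy=256.0)
print(f"roll={math.degrees(roll):.2f} deg, pitch={math.degrees(pitch):.2f} deg")
```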
Bavle, Hriday et al. In 2017 International Conference on Unmanned Aircraft Systems (ICUAS) (2017, June).

Sanchez Lopez, Jose Luis et al. In Journal of Intelligent and Robotic Systems (2016), 84(1-4), 601–620.

This paper presents a completely autonomous solution to participate in the 2013 International Micro Air Vehicle Indoor Flight Competition (IMAV2013). Our proposal is a modular multi-robot swarm architecture, based on the Robot Operating System (ROS) software framework, where the only information shared among swarm agents is each robot's position. Each swarm agent consists of an AR Drone 2.0 quadrotor connected to a laptop which runs the software architecture. In order to present a completely vision-based solution, the localization problem is simplified by the use of ArUco visual markers. These visual markers are used to sense and map obstacles and to improve the pose estimation, based on the IMU and optical flow data, by means of an Extended Kalman Filter localization and mapping method (a minimal pose-recovery sketch for such a marker pipeline appears at the end of this list). The presented solution and the performance of the CVG_UPM team were awarded the First Prize in the Indoors Autonomy Challenge of the IMAV2013 competition.

Sanchez Lopez, Jose Luis et al. In Robot 2015: Second Iberian Robotics Conference (2016, November).

Et al. In Dyna (2016), 91(3), 282–288.

Sanchez Lopez, Jose Luis et al. In Sensors (2015), 15(11), 29569–29593.

Lateral flow assay tests are nowadays becoming powerful, low-cost diagnostic tools. Obtaining a result is usually subject to visual interpretation of colored areas on the test by a human operator, introducing subjectivity and the possibility of errors in the extraction of the results. While automated test readers providing a result-consistent solution are widely available, they usually lack portability. In this paper, we present a smartphone-based automated reader for drug-of-abuse lateral flow assay tests, consisting of an inexpensive light box and a smartphone device. Test images captured with the smartphone camera are processed in the device using computer vision and machine learning techniques to perform automatic extraction of the results. A thorough validation has been carried out, showing the high accuracy of the system. The proposed approach, applicable to any line-based or color-based lateral flow test on the market, effectively reduces the manufacturing costs of the reader and makes it portable and massively available while providing accurate, reliable results.
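As an illustration of the kind of image processing such a reader performs, here is a minimal sketch of reading a line-based strip from a cropped grayscale photo. The region of interest, the contrast threshold, and the simple row-profile heuristic are assumptions for illustration only; the paper's pipeline combines computer vision with machine learning and is not reproduced here.

```python
# Hedged sketch: detect the control and test lines of a lateral flow
# strip from a cropped grayscale image (darker pixels = stronger line).
# Whether a visible test line means "positive" or "negative" depends on
# the assay format; competitive drug-of-abuse tests invert the reading.
import numpy as np

def read_strip(strip_gray, control_row, test_row, window=10, min_contrast=12.0):
    """strip_gray: 2-D uint8 array cropped to the strip, lines horizontal.
    control_row / test_row: approximate row indices of the two lines.
    Returns (valid, test_line_present): valid if the control line shows."""
    profile = strip_gray.mean(axis=1).astype(float)   # one value per row
    background = np.median(profile)                   # strip background level

    def line_strength(row):
        lo, hi = max(0, row - window), row + window
        return background - profile[lo:hi].min()      # darker => larger

    valid = line_strength(control_row) > min_contrast
    test_line_present = valid and line_strength(test_row) > min_contrast
    return valid, test_line_present
```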
Sanchez Lopez, Jose Luis et al. In 2014 International Conference on Unmanned Aircraft Systems (ICUAS) (2014, May).

This paper presents a completely autonomous solution to participate in the 2013 International Micro Air Vehicle Indoor Flight Competition (IMAV2013). Our proposal is a modular multi-robot swarm architecture, based on the Robot Operating System (ROS) software framework, where the only information shared among swarm agents is each robot's position. Each swarm agent consists of an AR Drone 2.0 quadrotor connected to a laptop which runs the software architecture. In order to present a completely vision-based solution, the localization problem is simplified by the use of ArUco visual markers. These visual markers are used to sense and map obstacles and to improve the pose estimation, based on the IMU and optical flow data, by means of an Extended Kalman Filter localization and mapping method. The presented solution and the performance of the CVG_UPM team were awarded the First Prize in the Indoors Autonomy Challenge of the IMAV2013 competition.

Sanchez Lopez, Jose Luis et al. In ROBOT2013: First Iberian Robotics Conference (2013, November).

Sanchez Lopez, Jose Luis. Scientific Conference (2013, September).
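To illustrate the ArUco-based sensing described in the two IMAV2013 papers above (the 2016 journal version and the 2014 ICUAS version), here is a minimal marker-detection and pose-recovery sketch using the cv2.aruco module from opencv-contrib-python. The dictionary, marker size, and intrinsics below are assumptions, and aruco API details vary across OpenCV versions; in the papers' pipeline, poses like these would feed the Extended Kalman Filter as observations.

```python
# Hedged sketch: detect ArUco markers and recover the pose of each
# marker relative to the camera via planar PnP on its four corners.
# Marker dictionary, size, and intrinsics are assumed values.
import cv2
import numpy as np

MARKER_SIDE = 0.15  # assumed marker side length in meters
# Assumed pinhole intrinsics; use your camera's calibration in practice.
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
DIST = np.zeros(5)  # assume negligible lens distortion

# 3-D marker corners in the marker frame (z = 0 plane, centered),
# ordered to match detectMarkers output: TL, TR, BR, BL.
half = MARKER_SIDE / 2.0
OBJ_POINTS = np.array([[-half,  half, 0.0],
                       [ half,  half, 0.0],
                       [ half, -half, 0.0],
                       [-half, -half, 0.0]], dtype=np.float32)

def marker_poses(gray):
    """Return {marker_id: (rvec, tvec)} for every detected marker."""
    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, dictionary)
    poses = {}
    if ids is None:
        return poses
    for marker_id, quad in zip(ids.flatten(), corners):
        img_points = quad.reshape(4, 2).astype(np.float32)
        ok, rvec, tvec = cv2.solvePnP(OBJ_POINTS, img_points, K, DIST)
        if ok:
            poses[int(marker_id)] = (rvec, tvec)
    return poses
```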