Ignacio Martinez-Alpiste, Gelayol Golcarenarenji, Qi Wang and Jose Maria Alcaraz-Calero
Conference | 23rd ACM International Conference on Modeling, Analysis and Simulation of Wireless and Mobile Systems |
Symposium | 10th ACM International Symposium on Design and Analysis of Intelligent Vehicular Networks and Applications (DIVANet'20) |
Track | SD2-Vehicular Networks |
Core | A |
Location | Alicante |
Number of pages | 7 |
Early online date | 21st of September |
Original language | English |
To improve the speed and accuracy of human detection in Search and Rescue (SAR) operations, this paper presents a novel and highly efficient machine-learning-empowered system that extends the You Only Look Once (YOLO) algorithm and is designed for deployment on an embedded system. The proposed approach has been evaluated under real-world conditions on a Jetson AGX Xavier platform, and the results show a well-balanced system in terms of accuracy, speed and portability. Moreover, the system demonstrates its robustness by performing low-pixel human detection on infrared images received from an Unmanned Aerial Vehicle (UAV) in low-light conditions, at different altitudes and with subjects in different postures such as sitting, walking and running. In a constrained environment, the proposed approach achieves 89.26% accuracy at 24.6 FPS, surpassing the threshold for real-time object recognition.
DOI: 10.1145/3416014.3424600
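
For illustration only, and not the system released with this paper, the sketch below shows how a YOLO-style detector could be run on video frames with OpenCV's DNN module and how an average frame rate could be measured on an embedded platform. The model files ("yolo.cfg"/"yolo.weights"), the video source, the 416x416 input size and the confidence threshold are all placeholder assumptions.

```python
# Illustrative sketch only: YOLO-style inference with OpenCV's DNN module and
# a simple end-to-end FPS measurement. Model files and video source are
# placeholders, not the artifacts from the paper.
import time

import cv2
import numpy as np

# Load a Darknet/YOLO network; file names are placeholder assumptions.
net = cv2.dnn.readNetFromDarknet("yolo.cfg", "yolo.weights")

# On an embedded GPU such as the Jetson AGX Xavier, a CUDA backend can be
# requested if OpenCV was compiled with CUDA support.
if cv2.cuda.getCudaEnabledDeviceCount() > 0:
    net.setPreferableBackend(cv2.dnn.DNN_BACKEND_CUDA)
    net.setPreferableTarget(cv2.dnn.DNN_TARGET_CUDA)

output_layers = net.getUnconnectedOutLayersNames()


def detect(frame, conf_threshold=0.5):
    """Run one forward pass and return boxes above the confidence threshold
    (non-maximum suppression omitted for brevity)."""
    blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416),
                                 swapRB=True, crop=False)
    net.setInput(blob)
    h, w = frame.shape[:2]
    boxes = []
    for out in net.forward(output_layers):
        for det in out:
            scores = det[5:]
            if scores[np.argmax(scores)] > conf_threshold:
                cx, cy, bw, bh = det[0] * w, det[1] * h, det[2] * w, det[3] * h
                boxes.append((int(cx - bw / 2), int(cy - bh / 2),
                              int(bw), int(bh)))
    return boxes


# Average throughput (FPS) over a video stream, e.g. frames received from a UAV.
cap = cv2.VideoCapture("uav_infrared.mp4")  # placeholder source
frames, start = 0, time.time()
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    detect(frame)
    frames += 1
cap.release()
if frames:
    print(f"Average throughput: {frames / (time.time() - start):.1f} FPS")
```

Timing the whole loop (decode, pre-processing and inference) yields an end-to-end frame rate, which is comparable in spirit to the kind of throughput figure quoted in the abstract.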