Smartphone-based object recognition with embedded machine learning intelligence for unmanned aerial vehicles

Ignacio Martinez-Alpiste, Pablo Casaseca-de-la-Higuera, Jose M. Alcaraz-Calero, Christos Grecos, Qi Wang

    Research output: Contribution to journal › Article › peer-review

    13 Citations (Scopus)
    244 Downloads (Pure)

    Abstract

    Existing artificial intelligence solutions typically operate on powerful platforms with abundant computational resources. However, a growing number of emerging use cases, such as those based on unmanned aerial systems (UAS), require new solutions with artificial intelligence embedded in a highly mobile platform. This paper proposes an innovative UAS that exploits machine learning (ML) capabilities on a smartphone-based mobile platform for object detection and recognition applications. A new system framework tailored to this challenging use case is designed, with a customized workflow specified. Furthermore, the embedded ML design leverages TensorFlow, a cutting-edge open-source ML framework. The system prototype integrates all the architectural components into a fully functional system suitable for real-world operational environments such as search and rescue. Experimental results validate the design and prototyping of the system and demonstrate improved overall performance compared with the state of the art across a wide range of metrics.
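    For illustration, the sketch below shows what smartphone-based object detection of this kind can look like with TensorFlow Lite, the mobile runtime of the TensorFlow framework the paper builds on. It is a minimal Kotlin example assuming a quantized SSD-MobileNet model with the standard four-tensor detection output; the model file name, input resolution, and score threshold are illustrative assumptions, not details taken from the paper.

        import org.tensorflow.lite.Interpreter
        import java.io.File
        import java.nio.ByteBuffer
        import java.nio.ByteOrder

        // Assumed values for a typical quantized SSD-MobileNet model (not from the paper).
        const val INPUT_SIZE = 300       // common SSD-MobileNet input resolution
        const val MAX_DETECTIONS = 10    // typical post-processed detection count

        fun main() {
            // Load a .tflite detection model; the file name is hypothetical.
            val interpreter = Interpreter(File("ssd_mobilenet_v1.tflite"))

            // One RGB frame as a direct byte buffer: 1 x 300 x 300 x 3 uint8 values.
            val input = ByteBuffer.allocateDirect(INPUT_SIZE * INPUT_SIZE * 3)
                .order(ByteOrder.nativeOrder())
            // ... fill `input` with pixel bytes from the UAV's camera frame ...

            // Standard SSD outputs: bounding boxes, class ids, scores, detection count.
            val boxes = Array(1) { Array(MAX_DETECTIONS) { FloatArray(4) } }
            val classes = Array(1) { FloatArray(MAX_DETECTIONS) }
            val scores = Array(1) { FloatArray(MAX_DETECTIONS) }
            val count = FloatArray(1)

            val outputs = mapOf(0 to boxes, 1 to classes, 2 to scores, 3 to count)
            interpreter.runForMultipleInputsOutputs(arrayOf(input), outputs)

            // Report detections above an assumed confidence threshold.
            for (i in 0 until count[0].toInt()) {
                if (scores[0][i] > 0.5f) {
                    println("class=${classes[0][i].toInt()} score=${scores[0][i]} box=${boxes[0][i].toList()}")
                }
            }
        }

    In a full pipeline, this inference step would run per frame on the smartphone mounted on the UAV, with the drawing of detections and any downstream decision logic handled by the surrounding application.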

    Original language: English
    Pages (from-to): 404-420
    Number of pages: 17
    Journal: Journal of Field Robotics
    Volume: 37
    Issue number: 3
    Early online date: 6 Nov 2019
    DOIs
    Publication status: Published - 30 Apr 2020

    Keywords

    • Machine learning
    • Object detection and recognition
    • Unmanned aerial vehicle (UAV)
    • Image processing
    • Smartphone
