Simple yet efficient real-time pose-based action recognition

  • Recognizing human actions is a core challenge for autonomous systems, as they directly share space with humans. Such systems must be able to recognize and assess human actions in real time. Training the corresponding data-driven algorithms requires a significant amount of annotated training data. We demonstrate a pipeline that detects humans, estimates their pose, tracks them over time, and recognizes their actions in real time using standard monocular camera sensors. For action recognition, we transform noisy human pose estimates into an image-like format we call Encoded Human Pose Image (EHPI). This encoding can then be classified using standard methods from the computer vision community. With this simple procedure we achieve competitive state-of-the-art performance in pose-based action detection while ensuring real-time performance. In addition, we show a use case in the context of autonomous driving to demonstrate how such a system can be trained to recognize human actions using simulation data.
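The idea of encoding a pose sequence as an image can be sketched as follows. This is a minimal illustration, not the authors' exact implementation: the function name, the (joints × frames × channels) tensor layout, and the min-max normalization over the sequence are assumptions made for the sake of the example.

```python
import numpy as np

def encode_ehpi(poses):
    """Encode a sequence of 2D poses into an image-like tensor (EHPI-style).

    poses: array of shape (T, J, 2) -- T frames, J joints, (x, y) pixel coords.
    Returns an array of shape (J, T, 3): joints as rows, frames as columns,
    with x and y coordinates normalized to [0, 1] in the first two channels
    (the third channel is left zero), so a standard image CNN can consume it.
    """
    poses = np.asarray(poses, dtype=np.float32)
    t, j, _ = poses.shape
    ehpi = np.zeros((j, t, 3), dtype=np.float32)
    # Normalize each coordinate axis over the whole sequence so the
    # encoding is invariant to the person's absolute position in the frame.
    for c in range(2):
        coords = poses[:, :, c]
        lo, hi = coords.min(), coords.max()
        scale = (hi - lo) if hi > lo else 1.0
        ehpi[:, :, c] = ((coords - lo) / scale).T
    return ehpi
```

The resulting tensor can be fed to any off-the-shelf image classifier, which is what makes the representation convenient: action recognition reduces to standard image classification.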

Download full text files

  • 2510.pdf (English)

Metadata
Author of HS Reutlingen: Ludl, Dennis; Gulde, Thomas; Curio, Cristóbal
DOI: https://doi.org/10.1109/ITSC.2019.8917128
ISBN: 978-1-5386-7024-8
Published in: 2019 IEEE Intelligent Transportation Systems Conference (ITSC), Auckland, New Zealand, 27-30 October 2019
Publisher: IEEE
Place of publication: Piscataway, NJ
Document type: Conference proceeding
Language: English
Publication year: 2019
Page number: 8
First page: 581
Last page: 588
DDC classes: 004 Computer science
Open access?: No
Licence: In Copyright (Urheberrechtlich geschützt)