Interactive and Interpretable Online Human Activity Recognition
by Yale Hartmann, Hui Liu, Tanja Schultz
Abstract:
This demo paper puts forward our interactive and inspectable real-time recognition demo for human activity recognition. The demo captures online data from wearable sensors such as inertial measurement units, recognizes the performed human activity using Hidden Markov Models, and displays the search's current state along with the recognition results in real time. This allows users to interact with the system and probe its recognition by altering their actions and comparing their expectations with the demo's performance. Students, researchers, or the general public can easily extend the detected classes by utilizing the integrated recording, segmentation, and re-training tools, all while running on moderate hardware or even IoT devices such as a Raspberry Pi. The demo is titled "ASK2.0" and provides an easy-to-use and powerful way to teach and explain real-time human activity recognition and machine learning.
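The classification approach described above, one Hidden Markov Model per activity class, scored against incoming sensor data, can be sketched as follows. This is a minimal illustration in plain NumPy, not the authors' ASK2.0 implementation: the toy models, state parameters, and the 1-D "acceleration" feature are all assumptions made for the example. Each class is modeled by a Gaussian-emission HMM, the forward algorithm computes the log-likelihood of an observation sequence under each model, and the class with the highest likelihood wins.

```python
import math
import numpy as np

def log_gauss(x, mean, var):
    """Log density of a diagonal Gaussian at observation x."""
    return -0.5 * np.sum(np.log(2 * math.pi * var) + (x - mean) ** 2 / var)

def forward_loglik(obs, pi, A, means, vars_):
    """Log P(obs | model) via the forward algorithm, computed in the log domain."""
    n_states = len(pi)
    log_alpha = np.log(pi) + np.array(
        [log_gauss(obs[0], means[s], vars_[s]) for s in range(n_states)])
    for x in obs[1:]:
        log_alpha = np.array([
            np.logaddexp.reduce(log_alpha + np.log(A[:, s])) + log_gauss(x, means[s], vars_[s])
            for s in range(n_states)])
    return np.logaddexp.reduce(log_alpha)

# Two hypothetical activity models over a 1-D "acceleration magnitude" feature:
# "walking" alternates between a low- and a high-acceleration state,
# while "sitting" stays almost entirely in a low-acceleration state.
models = {
    "walking": dict(pi=np.array([0.5, 0.5]),
                    A=np.array([[0.7, 0.3], [0.3, 0.7]]),
                    means=np.array([[0.0], [2.0]]),
                    vars_=np.array([[0.5], [0.5]])),
    "sitting": dict(pi=np.array([0.9, 0.1]),
                    A=np.array([[0.95, 0.05], [0.5, 0.5]]),
                    means=np.array([[0.0], [0.3]]),
                    vars_=np.array([[0.2], [0.2]])),
}

def classify(obs):
    """Return the class whose HMM assigns the sequence the highest likelihood."""
    return max(models, key=lambda lbl: forward_loglik(obs, **models[lbl]))

# A sequence alternating between low and high acceleration, as walking would.
walk_like = np.array([[0.1], [1.9], [0.0], [2.1], [0.2], [1.8]])
print(classify(walk_like))  # should favor "walking"
```

In an online setting like the demo's, the same per-class likelihoods can be updated incrementally as each new sensor frame arrives, which is what makes it possible to display the current state of the search in real time.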
Reference:
Interactive and Interpretable Online Human Activity Recognition (Yale Hartmann, Hui Liu, Tanja Schultz), In PERCOM 2022 - 20th IEEE International Conference on Pervasive Computing and Communications Workshops and other Affiliated Events (PerCom Workshops), 2022.
Bibtex Entry:
@inproceedings{hartmann2022demo,
  title = {Interactive and Interpretable Online Human Activity Recognition},
  author = {Hartmann, Yale and Liu, Hui and Schultz, Tanja},
  booktitle = {PERCOM 2022 - 20th IEEE International Conference on Pervasive Computing and Communications Workshops and other Affiliated Events (PerCom Workshops)},
  year = {2022},
  pages = {109--111},
  abstract = {This demo paper puts forward our interactive and inspectable real-time recognition demo for human activity recognition. The demo captures online data from wearable sensors such as inertial measurement units, recognizes the performed human activity using Hidden Markov Models, and displays the search's current state along with the recognition results in real time. This allows users to interact with the system and probe its recognition by altering their actions and comparing their expectations with the demo's performance. Students, researchers, or the general public can easily extend the detected classes by utilizing the integrated recording, segmentation, and re-training tools, all while running on moderate hardware or even IoT devices such as a Raspberry Pi. The demo is titled "ASK2.0" and provides an easy-to-use and powerful way to teach and explain real-time human activity recognition and machine learning.},
  doi = {10.1109/PerComWorkshops53856.2022.9767207},
  url = {https://www.csl.uni-bremen.de/cms/images/documents/publications/HartmannLiuSchultz_PERCOM2022.pdf}
}