Towards smart homes using low level sensory data

Asad Masood Khattak, Phan Tran Ho Truc, Le Xuan Hung, La The Vinh, Viet-Hung Dang, Donghai Guan, Zeeshan Pervez, Manhyung Han, Sungyoung Lee, Young-Koo Lee

Research output: Contribution to journal · Article · peer-review

23 Citations (Scopus)


Ubiquitous Life Care (u-Life care) is receiving attention because it provides high-quality, low-cost care services. To provide spontaneous and robust healthcare services, knowledge of a patient's real-time daily life activities is required. Context information combined with real-time daily life activities can help to provide better services and to improve healthcare delivery. The performance and accuracy of existing life care systems are not reliable, even with a limited number of services. This paper presents a Human Activity Recognition Engine (HARE) that monitors human health as well as activities using heterogeneous sensor technology and processes these activities intelligently on a Cloud platform to provide improved care at low cost. We focus on activity recognition using video-based, wearable sensor-based, and location-based activity recognition engines and then use intelligent processing to analyze the context of the activities performed. In experiments, all the components showed good accuracy compared with existing techniques. The system is deployed on the Cloud for Alzheimer's disease patients (as a case study) with four activity recognition engines that identify low-level activities from the raw data captured by sensors. These are then manipulated using an ontology to infer higher-level activities and to make decisions about a patient's activity using patient profile information and customized rules.
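The inference step described above (low-level activities plus a patient profile, combined through customized rules, yielding a higher-level activity) can be sketched in a few lines. This is a minimal illustrative sketch, not the authors' implementation: the rule predicates, activity names, and profile fields below are all assumptions for demonstration.

```python
# Hypothetical sketch of rule-based high-level activity inference:
# low-level activities recognized by the sensor engines are mapped to
# higher-level activities using customized rules and profile information.
# All names and rules here are illustrative, not from the paper.

def infer_high_level_activity(low_level, profile, rules):
    """Return the first high-level activity whose rule matches."""
    for high_level, condition in rules.items():
        if condition(low_level, profile):
            return high_level
    return "unknown"

# Example rules: each maps a high-level activity to a predicate over the
# observed low-level activities and the patient profile.
rules = {
    "wandering": lambda acts, p: "walking" in acts
                 and acts.get("location") == "outdoors"
                 and p.get("condition") == "alzheimers",
    "eating":    lambda acts, p: "sitting" in acts
                 and acts.get("location") == "kitchen",
}

observed = {"walking": True, "location": "outdoors"}
profile = {"condition": "alzheimers"}
print(infer_high_level_activity(observed, profile, rules))  # -> wandering
```

In the paper this mapping is expressed over an ontology rather than Python predicates, which lets rules be added or customized per patient without changing the recognition engines.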
Original language: English
Pages (from-to): 11581-11604
Number of pages: 24
Issue number: 12
Publication status: Published - 12 Dec 2011
Externally published: Yes


Keywords

  • accelerometer
  • location sensor
  • video sensor
  • u-healthcare
  • activity recognition


