Wednesday 4 December 2013

Unsupervised posture detection by smart phone accelerometer

ABSTRACT:

Human activity recognition (HAR) is achieved by analysing the contextual information collected from heterogeneous and unobtrusive wearable or built-in mobile device sensors, where an emerging topic is to infer the user's postural actions, such as sitting, standing, walking and running. HAR uses either statistical-tool-based (e.g. hidden Markov models) or pattern-recognition-based (e.g. Gaussian mixture models (GMMs), k-nearest neighbours) classification models to detect physical activities. However, the former methods mostly require predefined and user-manipulated system parameter settings, such as an arbitrary formation of the state transition matrix or the construction of filtering coefficients, whereas the latter rely on first creating high-dimensional feature vectors to exploit the signal characteristics (e.g. mean, standard deviation, correlation, frequency and wavelet transform models) of the sensory data, and then clustering these vectors according to user-manipulated (mostly visually observed) training-data classes. Proposed here is a lightweight, unsupervised, decision-tree-based classification method that detects the user's postural actions, such as sitting, standing, walking and running, as user states by analysing the data from a smartphone accelerometer sensor. The proposed method differs from other approaches by applying a sufficient number of signal-processing features to exploit the sensory data without requiring any a priori information. Experiments show that the proposed method still differentiates user states reliably (above 90% overall accuracy) even when the sensor is operated at lower sampling frequencies.
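To make the general pipeline described above concrete, the short Python/NumPy sketch below shows how windowed accelerometer features (magnitude standard deviation as a motion-intensity cue, mean per-axis gravity component as an orientation cue) can feed a simple decision tree over the four user states. This is not the paper's implementation: the window length, the axis convention (phone assumed in a trouser pocket with its y-axis along the leg) and the threshold values are illustrative assumptions, whereas the proposed method derives its tree splits from the data without such hand-set parameters.

# Minimal sketch, assuming a 3-axis accelerometer stream in units of g.
# Thresholds and axis convention are illustrative, not taken from the paper.
import numpy as np

def window_features(acc, fs=20.0, window_s=2.0):
    # acc: array of shape (N, 3) with x, y, z samples in g
    n = int(fs * window_s)
    mag = np.linalg.norm(acc, axis=1)
    feats = []
    for s in range(0, len(acc) - n + 1, n):
        feats.append({
            "mag_std": mag[s:s + n].std(),      # overall motion intensity
            "mean_y": acc[s:s + n, 1].mean(),   # gravity component along the leg
        })
    return feats

def classify(f):
    # Toy hand-set splits; the paper's tree is built without such
    # user-defined thresholds.
    if f["mag_std"] < 0.05:                     # static postures
        return "standing" if abs(f["mean_y"]) > 0.7 else "sitting"
    return "walking" if f["mag_std"] < 0.6 else "running"

# Example usage with synthetic data sampled at 20 Hz:
fs = 20.0
t = np.arange(0, 10, 1 / fs)
acc = np.zeros((len(t), 3))
acc[:, 1] = 0.9                                  # gravity mostly along y
acc[:, 2] = 0.3 * np.sin(2 * np.pi * 2.0 * t)    # crude walking-like bounce
for f in window_features(acc, fs=fs):
    print(classify(f))

Lowering fs in the sketch mimics the slower-sampling-frequency condition mentioned in the abstract: the windows contain fewer samples, but the two coarse features remain informative.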
