Introduction
Sitting patterns predict several healthy aging outcomes. These patterns can potentially be measured using hip-worn accelerometers, but current methods are limited by an inability to detect postural transitions. To overcome these limitations, we developed the Convolutional Neural Network Hip Accelerometer Posture (CHAP) classification method.
Methods
CHAP was developed on data from 709 older adults who wore an ActiGraph GT3X+ accelerometer on the hip, with ground-truth sit/stand labels derived from thigh-worn activPAL inclinometers worn concurrently for up to 7 d. CHAP was compared with traditional cut-point methods of sitting pattern classification as well as a previous machine-learned algorithm, two-level behavior classification (TLBC).
Results
For minute-level sitting versus nonsitting classification, CHAP agreed with activPAL more closely (93% agreement) than did the other methods (74%-83% agreement). CHAP also outperformed the other methods in sensitivity for detecting sit-to-stand transitions: cut-point, 73%; TLBC, 26%; CHAP, 83%. CHAP's positive predictive value for capturing sit-to-stand transitions was likewise superior: cut-point, 30%; TLBC, 71%; CHAP, 83%. Day-level sitting pattern metrics derived from CHAP, such as mean sitting bout duration, did not differ significantly from activPAL, whereas those from the other methods did: activPAL, 15.4 min mean sitting bout duration; CHAP, 15.7 min; cut-point, 9.4 min; TLBC, 49.4 min.
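To make the transition-detection metrics concrete, the sketch below shows one way sensitivity and positive predictive value (PPV) could be computed for event detection: each ground-truth sit-to-stand transition is matched to at most one predicted transition within a tolerance window. The matching procedure, the 15-second tolerance, and the timestamp format are illustrative assumptions, not details taken from the CHAP validation protocol.

```python
# Hypothetical sketch: scoring detected sit-to-stand transitions.
# Transition times are in seconds; the tolerance window is an assumption.

def evaluate_transitions(predicted, ground_truth, tolerance_s=15):
    """Match each ground-truth transition to at most one predicted
    transition within +/- tolerance_s, then compute sensitivity and PPV."""
    unmatched_pred = sorted(predicted)
    true_positives = 0
    for gt in sorted(ground_truth):
        # Closest still-unmatched prediction inside the tolerance window.
        candidates = [p for p in unmatched_pred if abs(p - gt) <= tolerance_s]
        if candidates:
            best = min(candidates, key=lambda p: abs(p - gt))
            unmatched_pred.remove(best)
            true_positives += 1
    # Sensitivity: share of true transitions that were detected.
    sensitivity = true_positives / len(ground_truth) if ground_truth else 0.0
    # PPV: share of detected transitions that were real.
    ppv = true_positives / len(predicted) if predicted else 0.0
    return sensitivity, ppv

# Example: four true transitions, four predictions; one false alarm at
# 600 s and one missed transition at 450 s.
sens, ppv = evaluate_transitions(
    predicted=[100, 310, 600, 905], ground_truth=[95, 300, 450, 900])
print(sens, ppv)  # 0.75 0.75
```

Under this scoring, a method with many spurious detections (like the cut-point approach's 30% PPV) inflates sensitivity at the cost of precision, which is why both metrics are reported.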
Conclusion
CHAP was the most accurate method for classifying sit-to-stand transitions and sitting patterns from free-living hip-worn accelerometer data in older adults. It enables more accurate analysis of older adult movement data, yielding better measures of sitting patterns and opening the door to large-scale cohort studies of the effects of sitting patterns on healthy aging outcomes.