Publication

Analysis of Contextual Sensors for Fall Detection

2019, Martinez-Villaseñor, Lourdes; Ponce, Hiram

Falls are a major problem among older people and often cause serious injuries. Efficient fall detection solutions are important to reduce the time it takes for a person who has suffered a fall to receive assistance. Given the recent availability of cameras and of wearable and ambient sensors, more fall detection research is focusing on combining different data modalities. Determining the positive effect of each modality, and of each combination, on the effectiveness of fall detection requires a detailed assessment. In this paper, we analyzed different combinations of wearable devices, namely IMUs and an EEG helmet, together with a grid of active infrared sensors, with the aim of determining the positive effect of contextual information on fall detection accuracy. We used long short-term memory (LSTM) networks to enable fall detection from raw sensor data. For some activities, certain combinations help discriminate other activities of daily living (ADL) from falls. © 2019 IEEE.
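
As an illustration of the kind of model the abstract describes, the sketch below shows a minimal LSTM classifier over windows of raw sensor samples, written in PyTorch. It is not the authors' implementation; the channel count, window length, hidden size, and two-class output are assumptions made for the example.

import torch
import torch.nn as nn

class FallLSTM(nn.Module):
    """Minimal LSTM classifier over windows of raw sensor data (illustrative only)."""
    def __init__(self, n_channels=9, hidden=64, n_classes=2):
        super().__init__()
        # the LSTM reads each window sample by sample (e.g. IMU accelerometer/gyro axes)
        self.lstm = nn.LSTM(n_channels, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)  # fall vs. activity of daily living

    def forward(self, x):              # x: (batch, time steps, channels)
        _, (h, _) = self.lstm(x)       # h: (num_layers, batch, hidden)
        return self.head(h[-1])        # class logits from the last hidden state

# usage: a batch of 8 three-second windows sampled at 50 Hz over 9 assumed channels
logits = FallLSTM()(torch.randn(8, 150, 9))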

Publication

Deep Learning for Multimodal Fall Detection

2019, Martinez-Villaseñor, Lourdes; Pérez-Daniel, Karina Ruby; Ponce, Hiram

Fall detection systems can help provide quick assistance to a person who has fallen, diminishing the severity of the consequences of the fall. Real-time fall detection is important to reduce both fear and the time a person remains lying on the floor after falling. In recent years, multimodal fall detection approaches have been developed to gain precision and robustness. In this work, we propose a multimodal fall detection system based on wearable sensors, ambient sensors, and vision devices. We used long short-term memory (LSTM) networks and convolutional neural networks (CNN) for our analysis, given that they are able to extract features from raw data and are well suited for real-time detection. To test our proposal, we built a public multimodal dataset for fall detection. After experimentation, our proposed method reached 96.4% accuracy and improved precision, recall, and F1-score over single LSTM or CNN networks for fall detection. © 2019 IEEE.
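
The sketch below illustrates one way to combine a CNN branch for vision frames with an LSTM branch for raw sensor windows, fusing the two by concatenation, in PyTorch. It is not the paper's actual model; the branch architectures, feature sizes, input shapes, and late-fusion scheme are assumptions made for the example.

import torch
import torch.nn as nn

class MultimodalFallNet(nn.Module):
    """Late-fusion CNN + LSTM fall detector (illustrative only)."""
    def __init__(self, sensor_channels=9, hidden=64, n_classes=2):
        super().__init__()
        # CNN branch: a single grayscale camera frame -> 32-dim feature vector
        self.cnn = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # LSTM branch: a window of raw wearable/ambient sensor samples -> hidden-dim vector
        self.lstm = nn.LSTM(sensor_channels, hidden, batch_first=True)
        # fusion head over the concatenated branch features
        self.head = nn.Linear(32 + hidden, n_classes)

    def forward(self, frame, sensors):
        v = self.cnn(frame)                  # (batch, 32)
        _, (h, _) = self.lstm(sensors)       # h: (num_layers, batch, hidden)
        return self.head(torch.cat([v, h[-1]], dim=1))

# usage: 8 samples, each a 64x64 frame plus a 150-step window of 9 assumed sensor channels
out = MultimodalFallNet()(torch.randn(8, 1, 64, 64), torch.randn(8, 150, 9))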