
dc.contributor.authorMartinez-Villaseñor, Lourdes
dc.contributor.authorPonce, Hiram
dc.contributor.otherCampus Ciudad de México
dc.coverage.spatialMéxico
dc.creatorMARÍA DE LOURDES GUADALUPE MARTÍNEZ VILLASEÑOR;241561
dc.creatorHIRAM EREDIN PONCE ESPINOSA;376768
dc.date.accessioned2020-04-13T14:57:36Z
dc.date.available2020-04-13T14:57:36Z
dc.date.issued2019
dc.identifier.citationMartínez Villaseñor, M. de L. G. and Ponce Espinosa, H. E. (2019). A concise review on sensor signal acquisition and transformation applied to human activity recognition and human–robot interaction. International Journal of Distributed Sensor Networks, 15 (6). DOI: 10.1177/1550147719853987
dc.identifier.issn1550-1329
dc.identifier.urihttps://hdl.handle.net/20.500.12552/4925
dc.identifier.urihttp://dx.doi.org/10.1177/1550147719853987
dc.description.abstractHuman activity recognition deals with the integration of sensing and reasoning, aiming to better understand people's actions. Moreover, it plays an important role in human interaction, human–robot interaction, and brain–computer interaction. Developing these approaches draws on efforts from both signal processing and artificial intelligence. In that sense, this article aims to present a concise review of signal processing in human activity recognition systems and to describe two examples of applications in human activity recognition and robotics: human–robot interaction and socialization, and imitation learning in robotics. In addition, it presents ideas and trends in the context of human activity recognition for human–robot interaction that are important when processing signals within those systems. ©2019 SAGE Publications Ltd, The Author(s).
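As a minimal, hypothetical sketch of the sensor-signal transformation the abstract refers to (not code from the article itself), the snippet below segments a raw tri-axial accelerometer stream into overlapping windows and extracts simple time-domain features for a downstream classifier. The window length, overlap, and feature set are illustrative assumptions only.

```python
import numpy as np

def sliding_windows(signal, window_size, overlap):
    """Segment a (n_samples, n_axes) signal into fixed-size windows.

    Overlapping fixed-width windows are a common segmentation scheme
    in the HAR literature this record describes.
    """
    step = int(window_size * (1 - overlap))
    return [signal[i:i + window_size]
            for i in range(0, len(signal) - window_size + 1, step)]

def time_domain_features(window):
    """Extract simple per-axis time-domain features from one window."""
    return np.concatenate([
        window.mean(axis=0),  # mean acceleration per axis
        window.std(axis=0),   # variability per axis
        window.min(axis=0),
        window.max(axis=0),
    ])

# Example: 5 s of synthetic tri-axial accelerometer data at 50 Hz
# (placeholder for real sensor readings), segmented into 2 s windows
# with 50% overlap.
rng = np.random.default_rng(0)
accel = rng.normal(size=(250, 3))
windows = sliding_windows(accel, window_size=100, overlap=0.5)
X = np.array([time_domain_features(w) for w in windows])
print(X.shape)  # (4, 12): 4 windows x 12 features, ready for a classifier
```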
dc.language.isoeng
dc.publisherSAGE Publications Ltd.
dc.relation.ispartofREPOSITORIO SCRIPTA
dc.relation.ispartofREPOSITORIO NACIONAL CONACYT
dc.relation.ispartofOPENAIRE
dc.rightsOpen Access
dc.rights.urihttp://creativecommons.org/licenses/by-nc-sa/4.0
dc.rights.urihttp://sherpa.ac.uk/romeo/issn/1550-1329/es/
dc.sourceInternational Journal of Distributed Sensor Networks
dc.subjectHuman activity recognition
dc.subjectHuman–computer interaction
dc.subjectMachine learning
dc.subjectSensor signals
dc.subjectArtificial intelligence
dc.subjectHuman computer interaction
dc.subjectLearning systems
dc.subjectPattern recognition
dc.subjectRobotics
dc.subjectSignal processing
dc.subjectComputer interaction
dc.subjectHuman interactions
dc.subjectImitation learning
dc.subjectProcessing signal
dc.subjectRecognition systems
dc.subjectRobot interactions
dc.subjectHuman robot interaction
dc.subject.classificationENGINEERING AND TECHNOLOGY
dc.subject.classificationEngineering
dc.titleA concise review on sensor signal acquisition and transformation applied to human activity recognition and human–robot interaction
dc.typeArticle
dcterms.audienceResearchers
dcterms.audienceStudents
dcterms.audienceTeachers
dcterms.bibliographicCitationLoreti, D, Chesani, F, Mello, P. Complex reactive event processing for assisted living: the habitat project case study. Expert Syst Appl 2019; 126: 200–217.
dcterms.bibliographicCitationLara, OD, Labrador, MA. A survey on human activity recognition using wearable sensors. IEEE Commun Surv Tutor 2013; 15(3): 1192–1209.
dcterms.bibliographicCitationPonce, H, Martínez-Villaseñor, L, Miralles-Pechuán, L. A novel wearable sensor-based human activity recognition approach using artificial hydrocarbon networks. Sensors 2016; 16(7): E1033.
dcterms.bibliographicCitationKong, Y, Fu, Y. Human action recognition and prediction: a survey, 2018, https://arxiv.org/abs/1806.11230
dcterms.bibliographicCitationPonce, H, Miralles-Pechuán, L, Martínez-Villaseñor, L. A flexible approach for human activity recognition using artificial hydrocarbon networks. Sensors 2016; 16(11): 1715.
dcterms.bibliographicCitationPonce, H, Martínez-Villaseñor, L, Miralles-Pechuán, L. Comparative analysis of artificial hydrocarbon networks and data-driven approaches for human activity recognition. In: International conference on ubiquitous computing and ambient intelligence, Puerto Varas, Chile, 1–4 December 2015, pp.150–161. Berlin: Springer.
dcterms.bibliographicCitationVrigkas, M, Nikou, C, Kakadiaris, I. A review of human activity recognition methods. Front Robot AI 2015; 2: 28.
dcterms.bibliographicCitationRodomagoulakis, I, Kardaris, N, Pitsikalis, V. Multimodal human action recognition in assistive human-robot interaction. In: International conference on acoustics, speech and signal processing, Shanghai, China, 20–25 March 2016, pp.2702–2706. New York: IEEE.
dcterms.bibliographicCitationRamasamy Ramamurthy, S, Roy, N. Recent trends in machine learning for human activity recognition—a survey. Wiley Interdiscip Rev 2018; 8(4): e1254.
dcterms.bibliographicCitationWang, J, Chen, Y, Hao, S. Deep learning for sensor-based activity recognition: a survey. Patt Recog Lett 2019; 119: 3–11.
dcterms.bibliographicCitationVillani, V, Pini, F, Leali, F. Survey on human–robot collaboration in industrial settings: safety, intuitive interfaces and applications. Mechatronics 2018; 55: 248–266.
dcterms.bibliographicCitationVasconez, JP, Kantor, GA, Cheein, FAA. Human–robot interaction in agriculture: a survey and current challenges. Biosyst Eng 2019; 179: 35–48.
dcterms.bibliographicCitationHussein, A, Gaber, MM, Elyan, E. Imitation learning: a survey of learning methods. ACM Comput Surv 2017; 50(2): 21.
dcterms.bibliographicCitationBulling, A, Blanke, U, Schiele, B. A tutorial on human activity recognition using body-worn inertial sensors. ACM Comput Surv 2014; 46: 1–33.
dcterms.bibliographicCitationChen, C, Jafari, R, Kehtarnavaz, N. A survey of depth and inertial sensor fusion for human action recognition. Multimed Tools Appl 2017; 76(3): 4405–4425.
dcterms.bibliographicCitationKoshmak, G, Loutfi, A, Linden, M. Challenges and issues in multisensor fusion approach for fall detection. J Sensors 2016; 2016: 6931789.
dcterms.bibliographicCitationGravina, R, Alinia, P, Ghasemzadeh, H. Multi-sensor fusion in body sensor networks: state-of-the-art and research challenges. Inform Fusion 2017; 35: 68–80.
dcterms.bibliographicCitationNettleton, D, Orriols-Puig, A, Fornells, A. A study of the effect of different types of noise on the precision of supervised learning techniques. Artif Intell Rev 2010; 33: 275–306.
dcterms.bibliographicCitationPhinyomark, A, Nuidod, A, Phukpattaranont, P. Feature extraction and reduction of wavelet transform coefficients for EMG pattern classification. Elektron Elektrotech 2012; 122(6): 27–32.
dcterms.bibliographicCitationAvci, A, Bosch, S, Marin-Perianu, M. Activity recognition using inertial sensing for healthcare, wellbeing and sports applications: a survey. In: Proceedings of the 23rd international conference on architecture of computing systems, Hannover, 22–23 February 2010, pp.1–10. New York: IEEE.
dcterms.bibliographicCitationDargie, W. Analysis of time and frequency domain features of accelerometer measurements. In: Proceedings of 18th international conference on computer communications and networks (ICCCN), San Francisco, CA, 3–6 August 2009, pp.1–6. New York: IEEE.
dcterms.bibliographicCitationRasekh, A, Chen, CA, Lu, Y. Human activity recognition using Smartphone, 2014, https://arxiv.org/abs/1401.8212
dcterms.bibliographicCitationAtallah, L, Lo, B, King, R. Sensor placement for activity detection using wearable accelerometers. In: International conference on body sensor networks, Singapore, 7–9 June 2010, pp.24–29. New York: IEEE.
dcterms.bibliographicCitationPreece, SJ, Goulermas, JY, Kenney, LP. A comparison of feature extraction methods for the classification of dynamic activities from accelerometer data. IEEE Trans Biomed Eng 2009; 56(3): 871–879.
dcterms.bibliographicCitationWang, X, Xu, Y, Hu, H. Feedback-based metric learning for activity recognition. Expert Syst Appl. Epub ahead of print 10 September 2018. DOI: 10.1016/j.eswa.2018.09.021.
dcterms.bibliographicCitationHossain, HMS, Roy, N, Khan, MAAH. Active learning enabled activity recognition. In: International conference on pervasive computing and communications (PerCom), Sydney, NSW, Australia, 14–19 March 2017, pp.312–330. New York: IEEE.
dcterms.bibliographicCitationPreece, SJ, Goulermas, JY, Kenney, LP. Activity identification using body-mounted sensors—a review of classification techniques. Physiol Measure 2009; 30(4): R1–R33.
dcterms.bibliographicCitationRoggen, D, Calatroni, A, Rossi, M. Collecting complex activity datasets in highly rich networked sensor environments. In: Proceedings of the 7th international conference on networked sensing systems (INSS), Kassel, 15–18 June 2010, pp.233–240. New York: IEEE.
dcterms.bibliographicCitationDohnálek, P, Gajdoš, P, Moravec, P. Application and comparison of modified classifiers for human activity recognition. Prz Elektrotechniczny 2013; 89(11): 55–58.
dcterms.bibliographicCitationAltun, K. Intelligent sensing for robot mapping and simultaneous human localization and activity recognition. PhD Thesis, Bilkent University, Ankara, 2011.
dcterms.bibliographicCitationGuneysu, A, Arnrich, B. Socially assistive child-robot interaction in physical exercise coaching. In: Proceedings of the 26th international symposium on robot and human interactive communication (RO-MAN), Lisbon, 28 August–1 September 2017, pp.670–675. New York: IEEE.
dcterms.bibliographicCitationAggarwal, JK, Ryoo, MS. Human activity analysis: a review. ACM Comput Surv 2011; 43(3): 16.
dcterms.bibliographicCitationVishwakarma, S, Agrawal, A. A survey on activity recognition and behavior understanding in video surveillance. Visual Comput 2013; 29(10): 983–1009.
dcterms.bibliographicCitationGoodrich, MA, Schultz, AC. Human-robot interaction: a survey. Found Trend Human Comput Inter 2007; 1(3): 203–275.
dcterms.bibliographicCitationStork, JA, Spinello, L, Silva, J. Audio-based human activity recognition using non-markovian ensemble voting. In: IEEE RO-MAN: the 21st IEEE international symposium on robot and human interactive communication, Paris, 9–13 September 2012, pp.509–514.
dcterms.bibliographicCitationHe, W, Li, Z, Chen, CP. A survey of human-centered intelligent robots: issues and challenges. J Automatica Sinica 2017; 4(4): 602–609.
dcterms.bibliographicCitationKrüger, V, Kragic, D, Geib, C. The meaning of action: a review on action recognition and mapping. Adv Robot 2007; 21(13): 1473–1501.
dcterms.bibliographicCitationKe, SR, Thuc, HLU, Lee, YJ. A review on video-based human activity recognition. Computers 2013; 2(2): 88–131.
dcterms.bibliographicCitationPiyathilaka, L, Kodagoda, S. Human activity recognition for domestic robots. In: Mejias, L, Corke, P, Roberts, J (eds) Field and service robotics. Berlin: Springer, 2015, pp.395–408.
dcterms.bibliographicCitationFong, T, Nourbakhsh, I, Dautenhahn, K. A survey of socially interactive robots. Robot Autonom Syst 2003; 42(3): 143–166.
dcterms.bibliographicCitationStavropoulos, G, Giakoumis, D, Moustakas, K. Automatic action recognition for assistive robots to support MCI patients at home. In: Proceedings of the 10th international conference on pervasive technologies related to assistive environments, Rhodes, 21–23 June 2017, pp.366–371. New York: ACM.
dcterms.bibliographicCitationYang, X, Tian, Y. Eigenjoints-based action recognition using naive-bayes-nearest-neighbor. In: Proceedings of the international conference on computer vision and pattern recognition, Providence, RI, 16–21 June 2012, pp.14–19. New York: IEEE.
dcterms.bibliographicCitationBuys, K, Cagniart, C, Baksheev, A. An adaptable system for RGB-D based human body detection and pose estimation. J Visual Commun Image Represent 2014; 25(1): 39–52.
dcterms.bibliographicCitationScassellati, B, Admoni, H, Mataric, M. Robots for use in autism research. Ann Rev Biomed Eng 2012; 14: 275–294.
dcterms.bibliographicCitationFasola, J, Mataric, MJ. Using socially assistive human–robot interaction to motivate physical exercise for older adults. Proc IEEE 2012; 100(8): 2512–2526.
dcterms.bibliographicCitationXia, L, Gori, I, Aggarwal, JK. Robot-centric activity recognition from first-person RGB-D videos. In: IEEE winter conference on applications of computer vision, Waikoloa, HI, 5–9 January 2015, pp.357–364. New York: IEEE.
dcterms.bibliographicCitationSidobre, D, Broquere, X, Mainprice, J. Human–robot interaction. Berlin: Springer, 2012.
dcterms.bibliographicCitationLiu, H, Wang, L. Gesture recognition for human-robot collaboration: a review. Int J Indus Ergonomic 2018; 68: 355–367.
dcterms.bibliographicCitationMitra, S, Acharya, T. Gesture recognition: a survey. IEEE Trans Syst Man Cybernet C 2007; 37(3): 311–324.
dcterms.bibliographicCitationRautaray, SS, Agrawal, A. Vision based hand gesture recognition for human computer interaction: a survey. Artif Intell Rev 2015; 43(1): 1–54.
dcterms.bibliographicCitationZhang, X, Chen, X, Li, Y. A framework for hand gesture recognition based on accelerometer and EMG sensors. IEEE Trans Syst Man Cybernet A 2011; 41(6): 1064–1076.
dcterms.bibliographicCitationZhu, C, Sheng, W. Wearable sensor-based hand gesture and daily activity recognition for robot-assisted living. IEEE Trans Syst Man Cybernet A 2011; 41(3): 569–573.
dcterms.bibliographicCitationChaudhary, A, Raheja, JL, Das, K. Intelligent approaches to interact with machines using hand gesture recognition in natural way: a survey, 2013, https://arxiv.org/abs/1303.2292
dcterms.bibliographicCitationZhu, C, Sheng, W. Human daily activity recognition in robot-assisted living using multi-sensor fusion. In: International conference on robotics and automation, Kobe, Japan, 12–17 May 2009, pp.2154–2159. New York: IEEE.
dcterms.bibliographicCitationSheng, W, Du, J, Cheng, Q. Robot semantic mapping through human activity recognition: a wearable sensing and computing approach. Robot Autonom Syst 2015; 68: 47–58.
dcterms.bibliographicCitationMaxime, J, Alameda-Pineda, X, Girin, L. Sound representation and classification benchmark for domestic robots. In: International conference on robotics and automation, Hong Kong, China, 31 May–7 June 2014, pp.6285–6292. New York: IEEE.
dcterms.bibliographicCitationGouaillier, D, Hugel, V, Blazevic, P. Mechatronic design of NAO humanoid. In: International conference on robotics and automation, Kobe, Japan, 12–17 May 2009, pp.769–774. New York: IEEE.
dcterms.bibliographicCitationArgall, BD, Chernova, S, Veloso, M. A survey of robot learning from demonstration. Robot Autonom Syst 2009; 57(5): 469–483.
dcterms.bibliographicCitationArgall, BD, Billard, AG. A survey of tactile human–robot interactions. Robot Autonom Syst 2010; 58(10): 1159–1176.
dcterms.bibliographicCitationWosch, T, Feiten, W. Reactive motion control for human-robot tactile interaction. In: International conference on robotics and automation, Washington, DC, 11–15 May 2002, Vol. 4, pp.3807–3812. New York: IEEE.
dcterms.bibliographicCitationShibata, T. An overview of human interactive robots for psychological enrichment. Proc IEEE 2004; 92(11): 1749–1758.
dcterms.bibliographicCitationSilvera-Tawil, D, Rye, D, Velonaki, M. Artificial skin and tactile sensing for socially interactive robots: a review. Robot Autonom Syst 2015; 63: 230–243.
dcterms.bibliographicCitationWada, K, Shibata, T. Living with seal robots—its sociopsychological and physiological influences on the elderly at a care house. IEEE Trans Robot 2007; 23(5): 972–980.
dcterms.bibliographicCitationRobins, B, Amirabdollahian, F, Ji, Z. Tactile interaction with a humanoid robot for children with autism: a case study analysis involving user requirements and results of an initial implementation. In: International symposium on robot and human interactive communication, Viareggio, 13–15 September 2010, pp.704–711. New York: IEEE.
dcterms.bibliographicCitationHan, J, Campbell, N, Jokinen, K. Investigating the use of non-verbal cues in human-robot interaction with a NAO robot. In: Proceedings of the 3rd international conference on cognitive infocommunications, Kosice, 2–5 December 2012, pp.679–683. New York: IEEE.
dcterms.bibliographicCitationPieropan, A. Action recognition for robot learning. PhD Thesis, KTH Royal Institute of Technology, Stockholm, 2015.
dcterms.bibliographicCitationBandera, JP. Vision-based gesture recognition in a robot learning by imitation framework. PhD Thesis, Universidad de Malaga, Málaga, 2010.
dcterms.bibliographicCitationBandera, JP, Rodriguez, JA, Molina-Tanco, L. A survey of vision-based architectures for robot learning by imitation. Int J Humanoid Robot 2012; 9(1): 1250006.
dcterms.bibliographicCitationOsa, T, Pajarinen, J, Neumann, G. An algorithmic perspective on imitation learning. Found Trend Robot 2018; 7(1–2): 1–179.
dcterms.bibliographicCitationSchaal, S. Is imitation learning the route to humanoid robots? Trend Cognitive Sci 1999; 3(6): 233–242.
dcterms.bibliographicCitationMühlig, M, Gienger, M, Hellbach, S. Task-level imitation learning using variance-based movement optimization. In: International conference on robotics and automation, Kobe, Japan, 12–17 May 2009, pp.1177–1184. New York: IEEE.
dcterms.bibliographicCitationCalinon, S, Guenter, F, Billard, A. Goal-directed imitation in a humanoid robot. In: International conference on robotics and automation, Barcelona, 18–22 April 2005, pp.18–22. New York: IEEE.
dcterms.bibliographicCitationAsfour, T, Gyarfas, F, Azad, P. Imitation learning of dual-arm manipulation tasks in humanoid robots. In: Proceedings of the 6th IEEE RAS international conference on humanoid robots, Genova, 4–6 December 2006, pp.40–47. New York: IEEE.
dcterms.bibliographicCitationBreazeal, C, Scassellati, B. Challenges in building robots that imitate people. Cambridge, MA: The MIT Press, 2002.
dcterms.bibliographicCitationBurns, R, Jeon, M, Park, C. Robotic motion learning framework to promote social engagement. Appl Sci 2018; 8(2): 241.
dcterms.bibliographicCitationNehaniv, C, Dautenhahn, K. The correspondence problem. Cambridge, MA: The MIT Press, 2002.
dcterms.bibliographicCitationUde, A, Atkeson, CG, Riley, M. Programming full-body movements for humanoid robots by observation. Robot Autonom Syst 2004; 47(2): 93–108.
dcterms.bibliographicCitationDariush, B, Gienger, M, Arumbakkam, A. Online transfer of human motion to humanoids. Int J Humanoid Robot 2009; 6(2): 265–289.
dcterms.bibliographicCitationJin, S, Dai, C, Liu, Y. Motion imitation based on sparsely sampled correspondence. J Comput Inf Sci Eng 2016; 17: 041009.
dcterms.bibliographicCitationChella, A, Dindo, H, Infantino, I. A cognitive framework for imitation learning. Robot Autonom Syst 2006; 54(5): 403–408.
dcterms.bibliographicCitationIkemoto, S, Amor, HB, Minato, T. Physical human-robot interaction: mutual learning and adaptation. IEEE Robot Automat Mag 2012; 19(4): 24–35.
dcterms.bibliographicCitationYazdi, M, Bouwmans, T. New trends on moving object detection in video images captured by a moving camera: a survey. Comput Sci Rev 2018; 28(5): 157–177.
dcterms.bibliographicCitationKatsamanis, A, Pitsikalis, V, Theodorakis, S. Multimodal gesture recognition. In: Oviatt, S, Schuller, B, Cohen, PR. (eds) The handbook of multimodal-multisensor interfaces. New York: Association for Computing Machinery; Morgan & Claypool, 2017, pp.449–487.
dcterms.bibliographicCitationIshiguro, H, Nishio, S. Building artificial humans to understand humans. J Artif Organs 2018; 10: 133–142.
dcterms.bibliographicCitationHanson Robotics. Hi, I Am Sophia, https://www.hansonrobotics.com/sophia/ (2019, accessed 9 April 2019).
dcterms.bibliographicCitationCook, D, Feuz, KD, Krishnan, NC. Transfer learning for activity recognition: a survey. Know Inform Syst 2013; 36(3): 537–556.
dcterms.bibliographicCitationKober, J, Bagnell, JA, Peters, J. Reinforcement learning in robotics: a survey. The Int J Robot Res 2013; 32(11): 1238–1274.
dc.description.versionPublisher's version

