Published:

2022-11-26

Issue:

Vol. 16 No. 2 (2022)

Section:

Visión Actual

State of the art on technological trends for the analysis of behavior and human activities

Review of the state of the art on technological trends for the analysis of behavior and human activities

Authors

  • Freddy Oswaldo Ovalles-Pabón, Servicio Nacional de Aprendizaje - SENA

Keywords (en):

Human Activity Recognition, Human behavior, Sensors.

Keywords (es):

Human activity recognition, Human behavior, Sensors.

Abstract (en)

The study of human behavior provides knowledge about how people act, behavior that is shaped by multiple factors: cultural, social, psychological, genetic, and religious, among others, all of which influence people's relationships and their interaction with the environment. The vast amount of data generated in our daily lives, and the search for behavioral patterns within that data, has become a remarkable field of work whose value lies in the patterns identified and in the intelligent analysis that leads to new knowledge. Pattern recognition applied to human activities and daily living has found its greatest use in mobility management, health, and wellness.
This paper presents a review of technologies for human behavior analysis and their use as tools for diagnosis, assistance, interaction in intelligent environments, and assistive robotics applications. Its main scope is to give an overview of technological advances in the analysis of human behavior, activities of daily living, and mobility, and of the benefits obtained.

Abstract (es)

The study of human behavior provides knowledge about people's conduct; this conduct is determined by multiple factors: cultural, social, psychological, genetic, and religious, among others, which influence people's relationships and their interaction with the environment. The vast amount of data present in our lives, and the search for behavioral patterns from that data, has been a remarkable undertaking whose value lies in the patterns identified and in the intelligent analysis that leads to new knowledge. Pattern recognition applied to human activities and daily living has found its greatest application in mobility management, health, and wellness.
This paper presents a review of technologies for human behavior analysis and of their use as tools for diagnosis, assistance, interaction in intelligent environments, and assistive robotics applications. Its main scope is to give an overview of technological advances in the analysis of human behavior, activities of daily living, and mobility, and of the benefits obtained.
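
The abstract surveys sensor-based recognition of human activities and daily living. As a purely illustrative sketch of what such a pipeline can look like, the Python snippet below classifies fixed-length accelerometer windows using hand-crafted features and an off-the-shelf classifier; the synthetic data, window length, feature set, and the two activity labels are invented for this example and are not taken from the article or from any dataset it reviews.

```python
# Illustrative sketch only (not from the article): a minimal human activity
# recognition (HAR) pipeline that classifies fixed-length 3-axis accelerometer
# windows using hand-crafted features and a random forest. All data below is
# synthetic; the two "activities" are invented for demonstration.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def synthetic_windows(n_per_class=200, length=128):
    """Generate toy 3-axis accelerometer windows for two activities:
    label 0 ~ low-amplitude movement, label 1 ~ periodic high-amplitude movement."""
    t = np.linspace(0, 2 * np.pi, length)
    X, y = [], []
    for label, (amp, freq) in enumerate([(0.2, 0.5), (1.0, 2.0)]):
        for _ in range(n_per_class):
            signal = amp * np.sin(freq * t)[:, None]             # shared motion component
            noise = rng.normal(0.0, 0.1, size=(length, 3))       # per-axis sensor noise
            X.append(signal + noise)
            y.append(label)
    return np.asarray(X), np.asarray(y)

def extract_features(windows):
    """Simple per-window features: per-axis mean and standard deviation,
    plus signal magnitude area (SMA) over the three axes."""
    mean = windows.mean(axis=1)                                    # (n, 3)
    std = windows.std(axis=1)                                      # (n, 3)
    sma = np.abs(windows).mean(axis=1).sum(axis=1, keepdims=True)  # (n, 1)
    return np.hstack([mean, std, sma])

windows, labels = synthetic_windows()
features = extract_features(windows)
X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.3, random_state=0, stratify=labels)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

In practice, the synthetic windows would be replaced with labeled recordings from smartphone or body-worn accelerometers, and the feature set or classifier would be chosen to suit the activities of interest; this sketch only fixes the general shape of such a pipeline.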


