Show simple item record
New machine learning approaches for real-life human activity recognition using smartphone sensor-based data
dc.contributor.author | Garcia-Gonzalez, D. | * |
dc.contributor.author | Rivero Cebrián, Daniel | * |
dc.contributor.author | Fernández Blanco, Enrique | * |
dc.contributor.author | Luaces, M.R. | * |
dc.date.accessioned | 2025-09-12T11:45:56Z | |
dc.date.available | 2025-09-12T11:45:56Z | |
dc.date.issued | 2023 | |
dc.identifier.citation | Garcia-Gonzalez D, Rivero D, Fernandez-Blanco E, Luaces MR. New machine learning approaches for real-life human activity recognition using smartphone sensor-based data. Knowledge-Based Systems. 2023;262. | |
dc.identifier.issn | 0950-7051 | |
dc.identifier.other | https://portalcientifico.sergas.gal//documentos/63d5b3daf851ee1ba3e9ebd6 | |
dc.identifier.uri | http://hdl.handle.net/20.500.11940/21782 | |
dc.description.abstract | In recent years, research in human activity recognition (HAR) has shown continuous and steady growth, driven mainly by the application of smartphones in this area. Thanks to their wide range of sensors, small size, ease of use, low price, and applicability in many other fields, smartphones are a highly attractive option for researchers. However, the vast majority of studies carried out so far focus on laboratory settings rather than real-life environments. In this work, unlike in other papers, progress was sought on the latter point. To do so, a dataset already published for this purpose was used. This dataset was collected from the smartphone sensors of different individuals during their daily lives, with almost total freedom. To exploit these data, numerous experiments were carried out with various machine learning techniques, each with different hyperparameters. These experiments showed that, in this case, tree-based models such as Random Forest outperform the rest. The final result is an enormous improvement in the accuracy of the best model found to date for this purpose, from 74.39% to 92.97%. | |
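As an illustrative aside, the experimental setup described in the abstract (tree-based classifiers tuned over a hyperparameter grid) can be sketched in a few lines of Python with scikit-learn. This is a minimal sketch under assumed inputs, not the authors' actual pipeline: the feature matrix X (one row of summary statistics per sensor window), the labels y, and the grid values are all placeholders.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, train_test_split

# Placeholder data standing in for the real-life smartphone dataset:
# one row per time window, e.g. mean/std/energy per sensor axis.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 24))
y = rng.integers(0, 4, size=1000)  # assumed 4 activity classes

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0, stratify=y
)

# Illustrative hyperparameter grid; the paper's actual search space is not reproduced here.
param_grid = {"n_estimators": [100, 300], "max_depth": [None, 10, 20]}
search = GridSearchCV(RandomForestClassifier(random_state=0), param_grid, cv=5)
search.fit(X_train, y_train)

print("best params:", search.best_params_)
print("test accuracy:", search.best_estimator_.score(X_test, y_test))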
dc.description.sponsorship | This research was partially funded by MCIN/AEI/10.13039/501100011033, NextGenerationEU/PRTR, FLATCITY-POC, Spain [grant number PDC2021-121239-C31]; MCIN/AEI/10.13039/501100011033 MAGIST, Spain [grant number PID2019-105221RB-C41]; Xunta de Galicia/FEDER-UE, Spain [grant numbers ED431G 2019/01, ED481A 2020/003, ED431C 2022/46, ED431C 2018/49 and ED431C 2021/53]. Funding for open access charge: Universidade da Coruña/CISUG. | |
dc.language | eng | |
dc.rights | Attribution 4.0 International (CC BY 4.0) | * |
dc.rights.uri | http://creativecommons.org/licenses/by/4.0/ | * |
dc.title | New machine learning approaches for real-life human activity recognition using smartphone sensor-based data | |
dc.type | Article | |
dc.authorsophos | Garcia-Gonzalez, D.; Rivero, D.; Fernandez-Blanco, E.; Luaces, M.R. | |
dc.identifier.doi | 10.1016/j.knosys.2023.110260 | |
dc.identifier.sophos | 63d5b3daf851ee1ba3e9ebd6 | |
dc.journal.title | Knowledge-Based Systems | * |
dc.organization | Instituto de Investigación Biomédica de A Coruña (INIBIC) | |
dc.relation.projectID | MCIN/AEI, NextGenerationEU/PRTR, FLATCITY-POC, Spain [PDC2021-121239-C31] | |
dc.relation.projectID | MCIN/AEI, MAGIST, Spain [PID2019-105221RB-C41] | |
dc.relation.projectID | Xunta de Galicia/FEDER-UE, Spain [ED431G 2019/01, ED481A 2020/003, ED431C 2022/46, ED431C 2018/49, ED431C 2021/53] | |
dc.relation.projectID | Universidade da Coruña/CISUG | |
dc.relation.publisherversion | https://doi.org/10.1016/j.knosys.2023.110260 | |
dc.rights.accessRights | openAccess | * |
dc.subject.keyword | INIBIC | |
dc.typefides | Scientific Article (includes Original, Brief Original, Systematic Review and Meta-analysis) | |
dc.typesophos | Original Article | |
dc.volume.number | 262 |
Files in this item
This item appears in the following collection(s)
Except where otherwise noted, this item's license is described as Attribution 4.0 International (CC BY 4.0)
