
Basic human–robot interaction system running on an embedded platform

dc.contributor.author: Vega, Julio
dc.date.accessioned: 2024-02-01T07:58:54Z
dc.date.available: 2024-02-01T07:58:54Z
dc.date.issued: 2021-07-21
dc.identifier.citation: Julio Vega. Basic human–robot interaction system running on an embedded platform. Microprocessors and Microsystems, 85:104316–104323, July 2021
dc.identifier.issn: 0141-9331
dc.identifier.uri: https://hdl.handle.net/10115/29394
dc.description.abstract: Robotics will be a dominant area in society throughout future generations. Its presence is currently increasing in most daily-life settings, with devices and mechanisms that facilitate the accomplishment of diverse tasks, as well as in work scenarios, where machines perform more and more jobs. This growing presence of autonomous robotic systems in society is due to their great efficiency and safety compared to human capacity, owing mainly to the enormous precision of their sensor and actuator systems. Among these, vision sensors are of the utmost importance. Humans and many animals naturally enjoy powerful perception systems, but in robotics this remains a constant line of research. In addition to having a high capacity for reasoning and decision-making, these robots incorporate important advances in their perceptual systems, allowing them to interact effectively in the working environments of this new industrial revolution. Drawing on the most basic interaction between humans, looking at the face, this paper presents an innovative system developed for an autonomous, DIY robot. The system is composed of three modules. First, the face detection component, which detects human faces in the current image. Second, the scene representation algorithm, which offers a wider field of view than that of the single camera used, mounted on a servo-pan unit. Third, the active memory component, which was designed and implemented according to two competing dynamics: life and salience. The algorithm intelligently moves the servo-pan unit with the aim of finding new faces, following existing ones, and forgetting those that no longer appear in the scene. The system was developed and validated on a low-cost platform based on a Raspberry Pi 3 board.
dc.language.iso: eng
dc.publisher: Elsevier
dc.rights: Attribution 4.0 International
dc.rights.uri: http://creativecommons.org/licenses/by/4.0/
dc.subject: Python; Low-cost; Raspberry Pi; Visual attention; Face tracking; Human–robot interaction
dc.title: Basic human–robot interaction system running on an embedded platform
dc.type: info:eu-repo/semantics/article
dc.identifier.doi: 10.1016/j.micpro.2021.104316
dc.rights.accessRights: info:eu-repo/semantics/openAccess

