[Member list]

Generosi Andrea


Università Politecnica delle Marche

Institutional website
Scopus ID: 57201216842
ORCID: 0000-0001-6173-9616

Scientific publications

[1] Carulli M., Generosi A., Bordegoni M., Mengoni M., Design of XR Applications for Museums, Including Technology Maximising Visitors’ Experience, Lecture Notes in Mechanical Engineering, 1460-1470, (2023).

Abstract: eXtended Reality (XR) technology can enhance visitors’ experience of museums. Because the available XR technologies differ in performance, in the quality of the experience they provide, and in cost, it is helpful to rely on evaluations of the various technologies performed through user studies when selecting the most suitable ones. This paper presents a set of empirical studies on XR applications for museums aimed at selecting the technologies that best meet visitors’ expectations and maximise their willingness to repeat and recommend the experience. The studies provide valuable insights for developing Virtual Museum applications that increase the level of presence and the experience economy.

Keywords: Extended reality | Multisensory experience | Sense of smell | User experience | Virtual museum

[2] Generosi A., Ceccacci S., D’Angelo I., Del Bianco N., Cimini G., Mengoni M., Giaconi C., Emotion Analysis Platform to Investigate Student-Teacher Interaction, Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 13309 LNCS, 35-48, (2022).

Abstract: This paper introduces a system that enables the collection of relevant data on the emotional behavior and attention of both students and professors during exams. It exploits facial coding techniques to collect a large amount of data from the automatic analysis of students’ and professors’ faces, using video analysis, advanced deep-learning-based gaze-tracking techniques, and Affective Computing technologies and principles derived from the research of Paul Ekman. It provides tools that facilitate the interpretation of the collected data by means of a dashboard. A preliminary experiment was carried out to investigate whether such a system may help in assessing the evaluation setting and support reflection on evaluation processes across different situations, so as to foster the adoption of inclusive approaches. Results suggest that the information provided by the proposed system can be helpful in assessing the setting and the evaluation process.

Keywords: Affective computing | Deep learning | E-learning | Emotion recognition | Gaze tracking

[3] Ceccacci S., Generosi A., Mengoni M., A System to Support the Design and Management of Customer Experience Based on a Customer-Centered Approach, Lecture Notes in Mechanical Engineering, 753-760, (2022).

Abstract: This article introduces, at a conceptual level, an AI-based system able to determine the customer profile, in order to support customer experience design and management according to a customer-centered approach, by extracting information from the video streams provided by the security cameras installed in a store. The system collects customer demographic and behavioral information (e.g., age, gender, time spent in specific areas of the store, time spent interacting with the salesperson) through deep learning algorithms, in a completely anonymous way, without storing biometric data. To predict the customer profile from the collected data, it exploits a Bayesian Belief Network (BBN). The paper describes the overall system architecture, details the method used to model the BBN, and reports, through the description of a use case scenario, some examples of insights useful for guiding the choice of possible actions to improve the customer experience strategy.

Keywords: Customer experience | Customer profiling | Machine learning | Predictive models | Video analysis

[4] Generosi A., Agostinelli T., Ceccacci S., Mengoni M., A novel platform to enable the future human-centered factory, International Journal of Advanced Manufacturing Technology, (2022).

Abstract: This paper introduces a web platform that performs semi-automatic computation of several risk indexes, according to the chosen evaluation method (e.g., RULA—Rapid Upper Limb Assessment, REBA—Rapid Entire Body Assessment, OCRA—OCcupational Repetitive Action), to support ergonomic risk estimation, and provides augmented analytics to proactively improve ergonomic risk monitoring based on the characteristics of workers (e.g., age, gender), working tasks, and environment. It implements a marker-less, low-cost body detection system based on RGB cameras, which exploits the open-source deep learning model CMU (Carnegie Mellon University) from the tf-pose-estimation project, ensuring worker privacy and data protection; this system had already been successfully assessed in standard laboratory conditions. The paper provides a full description of the proposed platform and reports the results of its validation in a real industrial case study on a washing machine assembly line composed of 5 workstations. A total of 15 workers were involved. Results suggest that the proposed system can significantly speed up the ergonomic assessment and can predict angles and perform RULA and OCRA analyses with an accuracy comparable to that of a manual analysis, even under the unpredictable conditions found in a real working environment.

Keywords: Ergonomics risk assessment | Extended reality | Human-centered manufacturing | Machine learning | Motion capture

[5] Generosi A., Ceccacci S., Tezçi B., Montanari R., Mengoni M., Nudges-Based Design Method for Adaptive HMI to Improve Driving Safety, Safety, 8(3), (2022).

Abstract: This study introduces a new operational tool, based on the AEIOU observational framework, to support the design of adaptive human machine interfaces (HMIs) that aim to modify people’s behavior and support their choices, improving safety through emotional regulation techniques and the management of environmental characteristics (e.g., temperature and illumination), following an approach based on the nudging concept within a design thinking process. The approach builds on research in behavioral psychology that has studied the correlations between human emotions and driving behavior, pushing towards the elicitation of the emotions judged most suitable for safe driving. The main objective is to support the ideation of scenarios and/or design features for adaptive HMIs that implement a nudging strategy to increase driving safety. Finally, the results of a collaborative workshop, organized as a case study to collect concept ideas in the context of sports cars, are presented and evaluated to highlight the validity of the proposed methodology, as well as its limitations due to the need for prototypes to evaluate the actual effectiveness of the presented nudging strategies.

Keywords: adaptive HMI | affective computing | automotive | driving safety | emotion regulation | nudge

[6] Generosi A., Agostinelli T., Mengoni M., Smart retrofitting for human factors: a face recognition-based system proposal, International Journal on Interactive Design and Manufacturing, (2022).

Abstract: Industry nowadays must deal with the so-called “fourth industrial revolution”, i.e. Industry 4.0. This revolution is based on the introduction of new paradigms in the manufacturing industry such as flexibility, efficiency, safety, digitization, big data analysis, and interconnection. However, the integration of human factors is usually neglected, although it is one of these paradigms. Among its most overlooked aspects are the customization of the worker’s user experience and on-board safety. Moreover, integrating state-of-the-art technologies into legacy machines is of the utmost importance, as it can make a considerable difference in the economic and environmental aspects of their management by extending the machine’s life cycle. In response to this issue, the retrofitting paradigm, i.e. the addition of new technologies to legacy machines, has been considered. In this paper we propose a novel modular system architecture for secure authentication and traceability of workers’ log-in/log-out, based on face recognition and on state-of-the-art deep learning and computer vision techniques such as convolutional neural networks. Starting from the proposed architecture, we developed and tested a device designed to retrofit legacy machines with such capabilities, paying particular attention in the design phase to interface usability, which, like other human factors, is little considered in retrofitting applications despite being one of the pillars of Industry 4.0. The results of this research work show a dramatic improvement in the safety of on-board machine access.

Keywords: Face recognition | Industrial Internet of Things | Industry 4.0 | Safety | Smart retrofitting | Usability

[7] Agostinelli T., Generosi A., Ceccacci S., Khamaisi R.K., Peruzzini M., Mengoni M., Preliminary validation of a low-cost motion analysis system based on RGB cameras to support the evaluation of postural risk assessment, Applied Sciences (Switzerland), 11(22), (2021).

Abstract: This paper introduces a low-cost, low-computational-load, marker-less motion capture system based on the acquisition of frame images through standard RGB cameras. It exploits the open-source deep learning model CMU, from the tf-pose-estimation project. Its numerical accuracy and its usefulness for ergonomic assessment are evaluated through an experiment designed and performed to: (1) compare the data it provides with those collected from a gold-standard motion capture system; (2) compare the RULA scores obtained with its data against those obtained with data provided by the Vicon Nexus system and those estimated through video analysis by a team of three expert ergonomists. Tests were conducted in standardized laboratory conditions and involved a total of six subjects. Results suggest that the proposed system can predict angles with good consistency and give evidence of the tool’s usefulness for ergonomists.

Keywords: Ergonomic risk assessment | Industrial ergonomics | Motion capture | Postural analysis | RULA

[8] Ceccacci S., Generosi A., Leopardi A., Mengoni M., Mandorli F., The Role of Haptic Feedback and Gamification in Virtual Museum Systems, Journal on Computing and Cultural Heritage, 14(3), (2021).

Abstract: This article reports the results of research aimed at evaluating the ability of a haptic interface to improve the user experience (UX) with virtual museum systems. In particular, two user studies were carried out to (1) compare the experience aroused during the manipulation of a 3D-printed replica of an artifact with a pen-like stylus with that aroused during the (visual and tactile) interaction with a 3D rendering application using a haptic interface and a PC monitor, and (2) compare users’ perceived usability and UX among a traditional mouse-based desktop interface, a haptic interface, and a haptic gamified interface, based on the SUS scale and the AttrakDiff2 questionnaire. A total of 65 people were involved. The considered haptic application is based on the Omega 6 haptic device produced by Force Dimension and is a permanent attraction of the Museo Archeologico Nazionale delle Marche. Results suggest that the proposed haptic interface is suitable for people who commonly use mouse-based computer interaction but have no previous experience with haptic systems, and they provide insights useful for better understanding the role of haptic feedback and gamification in enhancing UX with virtual museums and for guiding the development of similar applications in the future.

Keywords: haptic interface | user experience | Virtual museum | virtual reality

[9] Ceccacci S., Mengoni M., Generosi A., Giraldi L., Presta R., Carbonara G., Castellano A., Montanari R., Designing in-car emotion-aware automation, European Transport - Trasporti Europei, (2021).

Abstract: Driver behaviour recognition is of paramount importance for in-car automation assistance. It is widely recognized that not only attentional states but also emotional ones have an impact on the safety of driving behaviour. This research work proposes an emotion-aware in-car architecture in which the driver’s emotions can be related to the vehicle dynamics, investigating the correlations between negative emotional states and driving performance, and proposing a system that regulates the driver’s engagement through a unique user experience (e.g. using music and LED lighting) in the car cabin. The relationship between altered emotional states induced through auditory stimuli and vehicle dynamics is investigated in a driving simulator. The results confirm the need for both types of information to improve the robustness of the driver state recognition function, and they open up the possibility that auditory stimuli can modify driving performance.

Keywords: Driver monitoring system | Emotion recognition | Facial expression recognition

[10] Ceccacci S., Generosi A., Cimini G., Faggiano S., Giraldi L., Mengoni M., Facial coding as a mean to enable continuous monitoring of student’s behavior in e-Learning, CEUR Workshop Proceedings, 2817, (2021).

Abstract: This paper introduces an e-learning platform for the management of MOOC-based courses, able to continuously monitor students’ behavior through facial coding techniques, with low computational effort on the client side, and to provide useful insights to the instructor. The system exploits the most recent developments in deep learning and computer vision for Affective Computing, in compliance with the European GDPR. Taking as input the video captured by the webcam of the device used to attend the course, it: (1) performs continuous student authentication based on face recognition, (2) monitors the student’s level of attention through head orientation tracking and gaze detection analysis, and (3) estimates the student’s emotions during course attendance. The paper describes the overall system design and reports the results of a preliminary survey, which involved a total of 14 subjects, aimed at investigating user acceptance in terms of intention to continue using such a system.

Keywords: Affective Computing | Deep Learning | E-learning | Facial Coding | Facial Recognition

[11] Generosi A., Ceccacci S., Faggiano S., Giraldi L., Mengoni M., A toolkit for the automatic analysis of human behavior in HCI applications in the wild, Advances in Science, Technology and Engineering Systems, 5(6), 185-192, (2020).

Abstract: Nowadays, smartphones and laptops equipped with cameras have become an integral part of our daily lives. The pervasive use of cameras enables the collection of an enormous amount of data, which can be easily extracted through video image processing. This opens up the possibility of using technologies that until now had been restricted to laboratories, such as eye-tracking and emotion analysis systems, to analyze users’ behavior in the wild during interaction with websites. In this context, this paper introduces a toolkit that takes advantage of deep learning algorithms to monitor users’ behavior and emotions, through the acquisition of facial expressions and eye gaze from the video captured by the webcam of the device used to browse the web, in compliance with the EU General Data Protection Regulation (GDPR). The collected data are potentially useful for supporting user experience assessment of web-based applications in the wild and for improving the effectiveness of e-commerce recommendation systems.

Keywords: Affective Computing | Convolutional Neural Networks | Deep Learning | Gaze detection | User Experience

[12] Ceccacci S., Mengoni M., Generosi A., Giraldi L., Carbonara G., Castellano A., Montanari R., A preliminary investigation towards the application of facial expression analysis to enable an emotion-aware car interface, Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 12189 LNCS, 504-517, (2020).

Abstract: The paper describes the conceptual model of an emotion-aware car interface able to: map both the driver’s cognitive and emotional states to the vehicle dynamics; adapt the level of automation or support the decision-making process when emotions negatively affecting driving performance are detected; and ensure emotion regulation and provide a unique user experience, creating a more engaging atmosphere (e.g. music, LED lighting) in the car cabin. To enable emotion detection, it implements a low-cost emotion recognition system able to recognize Ekman’s universal emotions by analyzing the driver’s facial expression from streaming video. A preliminary test was conducted to determine the effectiveness of the proposed emotion recognition system in a driving context. Results showed that the proposed system is capable of correctly classifying drivers’ emotions in a driving simulation context.

Keywords: Driver Monitoring System | Emotion recognition | Facial expression recognition

[13] Talipu A., Generosi A., Mengoni M., Giraldi L., Evaluation of Deep Convolutional Neural Network architectures for Emotion Recognition in the Wild, 2019 IEEE 23rd International Symposium on Consumer Technologies, ISCT 2019, 25-27, (2019).

Abstract: This paper presents software based on an innovative Convolutional Neural Network model to recognize the six Ekman universal emotions from photos of human faces captured in the wild. The CNN was trained using three different pre-labeled datasets, merged after being made homogeneous. A comparison among different types of CNN architectures, using the Keras framework for the Python language, is proposed and the evaluation results are presented.

Keywords: convolutional neural network | deep learning | emotion recognition

[14] Altieri A., Ceccacci S., Ciabattoni L., Generosi A., Talipu A., Turri G., Mengoni M., An Adaptive System to Manage Playlists and Lighting Scenarios Based on the User's Emotions, 2019 IEEE International Conference on Consumer Electronics, ICCE 2019, (2019).

Abstract: This paper introduces a new system capable of adaptively managing multimedia contents (e.g. music, video clips) and lighting scenarios based on the user's detected emotional state. The system captures the emotion from the user's facial expression and maps it into a 2D valence-arousal space, in which the multimedia content is also mapped, and matches both with a lighting color. Results of preliminary tests suggest that the proposed system is able to detect the user's emotional state and manage appropriate music and light colors in a symbiotic way.

[15] Generosi A., Altieri A., Ceccacci S., Foresi G., Talipu A., Turri G., Mengoni M., Giraldi L., MoBeTrack: A Toolkit to Analyze User Experience of Mobile Apps in the Wild, 2019 IEEE International Conference on Consumer Electronics, ICCE 2019, (2019).

Abstract: MoBeTrack (Mobile Behaviour Tracking) is a toolkit for the automated collection of the data necessary to support User Experience (UX) assessment of mobile applications. In contrast to existing frameworks, it is able to collect user demographic information (i.e., age and gender), trace every user interaction, and recognize the user's emotions during the use of an application. An SDK for iOS allows the toolkit to be easily embedded in any mobile application in a flexible and scalable way.

[16] Generosi A., Ceccacci S., Mengoni M., A deep learning-based system to track and analyze customer behavior in retail store, IEEE International Conference on Consumer Electronics - Berlin, ICCE-Berlin, 2018-September, (2018).

Abstract: The present work introduces an emotional tracking system to monitor the Shopping Experience at different touchpoints in a retail store, based on the elaboration of information extracted from biometric data and facial expressions. A preliminary test was carried out to determine the system's effectiveness in a real context with regard to emotion detection and the discrimination of customers' sex, age, and ethnicity. To this end, the information provided by the system was compared with the results of a traditional video analysis. Results suggest that the proposed system can be effectively used to support the analysis of customer experience in a retail context.

Keywords: customer experience | emotion analysis | emotion tracking | face recognition | facial expression | shopping experience

[17] Ceccacci S., Generosi A., Giraldi L., Mengoni M., An emotion recognition system for monitoring shopping experience, ACM International Conference Proceeding Series, 102-103, (2018).

Abstract: The present work introduces an emotional tracking system to monitor Shopping Experience at different touchpoints in a store, based on the elaboration of the information extracted from biometric data and facial expressions. Preliminary tests suggest that the proposed system can be effectively used in a retail context.

Keywords: Customer experience | Emotion analysis | Emotion tracking | Shopping experience

[18] Ceccacci S., Generosi A., Giraldi L., Mengoni M., Tool to make shopping experience responsive to customer emotions, International Journal of Automation Technology, 12(3), 319-326, (2018).

Abstract: This research aims to develop a system that examines and reacts to the changing behaviors and emotions of individuals in order to improve their shopping experience. The system is able to track emotions in real time at different touchpoints in a store and to control a set of networked devices, configuring the sensing space and all provided services to be responsive to the customers' needs. This paper describes the general approach adopted to design the overall system and illustrates in detail the prototyped module that understands users' emotions through the analysis of facial expressions.

Keywords: Context-aware computing | Emotion recognition | Methods for CX | Shopping experience

[19] Mengoni M., Ceccacci S., Generosi A., Leopardi A., Spatial Augmented Reality: An application for human work in smart manufacturing environment, Procedia Manufacturing, 17, 476-483, (2018).

Abstract: Spatial Augmented Reality (SAR) represents a key technology for the development of smart manufacturing, as it is barrier-free, does not require the use of Head Mounted Displays or any other wearable devices, and fits most industrial constraints. The paper presents a novel SAR-based system to support manual work in future smart factories. It conveys technical instructions during assembly, provides alerts in case of risks to human safety, and identifies which postures can lead to musculoskeletal problems if repeated. Experiments with 30 participants demonstrated the effectiveness of the proposed SAR-based system compared to a LED monitor-based system, as well as the overall usability achieved. The results proved that SAR technology improves the operators' performance with respect to a LED monitor-based system and that users accept it well. We found that SAR is more effective for difficult tasks than for simple ones.

Keywords: Augmented Reality | Ergonomic assessment | In-situ projection | Musculoskeletal Disorders evaluation | Task guidance

[20] Ceccacci S., Generosi A., Giraldi L., Mengoni M., An user-centered approach to design smart systems for people with dementia, IEEE International Conference on Consumer Electronics - Berlin, ICCE-Berlin, 2017-September, 273-278, (2017).

Abstract: This study describes a User-Centered approach to designing a User Interface (UI) to support the daily activities of people with dementia. The interface is the main hub of a home automation system able to monitor the house and remind users of relevant information when they approach the door to leave home. In order to involve end users in the UI evaluation at the end of the first stage of the design process, a specific experimental protocol, based on task analysis, structured interviews, and behavioral observation, is defined. It allows user-machine interaction to be evaluated considering aspects related both to the adequacy of product features and to users' subjective opinions and behavior. A disposable high-fidelity prototype of the UI was realized using a touch-screen tablet. Two tests, dedicated respectively to verifying the adequacy of the icons and the understandability of the interface, were performed. A total of 20 subjects with different MMSE scores were involved. Results show that people with mild and moderate dementia are able to understand and use the touch interface, and they provide some suggestions about how the GUI can be improved. Finally, some approaches to support future development activities and the next usability tests are discussed.

Keywords: Assistive Technology | Dementia | Human-Computer Interaction | Usability Evaluation