Many people believe that AI will cause more harm than good. Emotions and social norms play an important role in improving human-automation interactions and increasing interest in these technologies. To dissect their role in automation interactions, we must first understand the role of emotions, social norms, and, relatedly, trust in human-human interactions. After investigating these areas, we concluded that emotions and social norms significantly affect user behavior, and that implementing such behaviors in automations would narrow the gap between interacting with a technology and interacting with a human. After examining the role of trust in human interactions, we deduced that promoting higher trust in human-robot interactions would increase users' willingness to be vulnerable when interacting with automations, and consequently their interest and engagement. Applying these findings to automations and conducting further research, we concluded that automations exhibiting socially normative behaviors would visibly conform to social and emotional norms and, as a result, appear more trustworthy to users in human-automation interactions. We therefore conclude that in order to increase user trust, and with it engagement and interest, we must implement emotions and social norms in automations. In future work, these findings could be applied through non-verbal cues, such as the mimicry of facial expressions, as well as verbal cues. If implemented, this may increase worldwide trust in automations and improve consumer experiences.
Research Paper by Anushka Jain, Grade 10, Thomas Jefferson High School for Science and Technology, VA