Design and methodological process for assessing quasi-experiments in virtual reality environments for deepfake recognition in the artificial intelligence era


Abstract

Nowadays, the impact of artificial intelligence tools on professional practice must be analyzed, as well as their influence on the field of journalism and information. One of the aspects that has generated most concern in this area is the use of these tools, which can generate audiovisual content, for the creation of deepfakes. This article presents the methodology used to carry out a quasi-experiment designed to study and analyze the behavior of young people when exposed to deepfakes generated with artificial intelligence tools, as well as their ability to identify them. The experiment is conducted in a virtual environment in which participants are immersed and with which they interact, viewing newspaper front pages that include contextual elements. Participants must review the information included in the virtual environment to determine whether the images displayed correspond to real people or to people generated with artificial intelligence tools. In addition, the influence and importance of the contextual elements accompanying an image in determining whether it is fake or real are analyzed. This article aims to detail the methodology used in this experiment in order to promote its replicability.

• This article proposes the method as a detailed guide to be replicated and reproduced in future academic research to understand the media diet of different population groups.

• Datasets are provided with results that allow for comparative, longitudinal, and replication studies.

• The A-Frame framework for the design of virtual environments is introduced and can be used for the design of quasi-experiments.
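A virtual environment of the kind described above can be built with A-Frame's declarative HTML primitives. The following is a minimal sketch, not the authors' actual experimental scene: the image path, label text, and layout are placeholder assumptions, shown only to illustrate how a front-page image and an accompanying contextual element can be placed in an immersive scene.

```html
<!-- Minimal A-Frame scene: a newspaper front page on a plane,
     with a contextual text label beneath it. Asset path and
     label wording are hypothetical placeholders. -->
<!DOCTYPE html>
<html>
  <head>
    <script src="https://aframe.io/releases/1.5.0/aframe.min.js"></script>
  </head>
  <body>
    <a-scene>
      <!-- Front-page image shown to the participant -->
      <a-image src="front-page-01.png" position="0 1.6 -2"
               width="1.2" height="1.6"></a-image>
      <!-- Contextual element accompanying the image -->
      <a-text value="Source: national daily, morning edition"
              position="0 0.6 -2" align="center" color="#333"></a-text>
      <a-sky color="#ECECEC"></a-sky>
    </a-scene>
  </body>
</html>
```

Because A-Frame runs in any WebXR-capable browser, a scene like this can be served as a static page, which simplifies replication of the quasi-experiment across laboratories and devices.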
