Media coverage of risks and responsibility in the automation debate

The rapid diffusion of algorithms, robots and artificial intelligence is accompanied by public debates about applications and opportunities as well as emerging risks such as the loss of jobs, manipulation, security threats and violations of privacy. This raises questions regarding responsibility for damage and problematic impacts of automation. Media reporting on risks and responsibilities plays a central role in shaping public perception of automation. The project therefore investigated the media coverage of risks and responsibilities in the automation discourse in general and differences in the coverage of social media algorithms and social robotics in particular.

The project was funded by the ÖAW’s Go!Digital Next Generation program and carried out by an interdisciplinary consortium of researchers from the Institute of Comparative Media and Communication Studies (CMC), the Austrian Centre for Digital Humanities and Cultural Heritage (ACDH-CH) and the Institute of Philosophy of the University of Vienna. The project team developed an innovative research approach that combined the concept of “responsibility networks” from the philosophy of technology with manual content analyses and automated text analyses of media reporting on automation issues.

Amount, trends, and tone of media reporting on automation …

The results of the content analyses show that the amount of media reporting on automation increases over time and is spread across more than 30 application areas. Automation in industry (e.g., Industry 4.0) is reported on most frequently, followed by automation in social interaction (e.g., social companions). The overall tenor of the reporting on automation is positive, although almost 40% of all articles also address risks. Compared to risks, questions of responsibility are rarely raised: only one in seven articles dealing with automation addresses responsibility for problems or solutions.

Differences in coverage of risks and responsibility for social media and social robotics …

In-depth analyses of the responsibility discourse show that coverage differs by automation technology and field of application. Media coverage of social media and internet algorithms is more critical than coverage of robotics and social companions. Challenges related to social companions and social robotics are framed primarily as social problems, are articulated mainly by experts, and are seen as challenges for society, the economy and research. For social media, the focus is on concerns regarding the public sphere, opinion formation and data protection; these concerns are articulated primarily by journalists, who clearly assign the greatest responsibility to the internet platforms.

Limits of automated media content analysis …

In terms of methods, the project showed to what extent responsibility discourses in the media can be captured by manual and automated media content analyses. In this regard, the project reveals that the tonality of media reports, as well as the thematization of risks and responsibility, can be analysed with both manual and automated content analyses. Individual risk types and the specific elements of responsibility relations (speakers, objects, subjects) can be captured well with manual content analysis. Responsibility networks also include norms, addressees and instances of responsibility; media content analyses demonstrate, however, that these relational elements are often not explicitly mentioned in media reporting but serve only as implicit points of reference in the media discourse. They are therefore better suited to qualitative content analyses that allow an adequate interpretation of the context.



Media reporting on algorithms, robotics and artificial intelligence: Representation of risks and responsibility in the automation debate (MARA)