The computer-generated “chance of integration” (IC value) was intended only to give the AMS an additional tool in the care of jobseekers. However, as this study shows, the AMS algorithm has far-reaching consequences for jobseekers, AMS staff and the AMS as an organisation:
The development of algorithmic systems for (semi-)governmental institutions such as the AMS requires anti-discrimination measures as well as system and data transparency, in order to enable a comprehensive evaluation from a technical, fundamental-rights, democratic and rule-of-law perspective.
In addition, the study recommends rights of access and objection for those affected, public consultations, and training in new skills for AMS advisors and customers wherever algorithmic systems are used in the public sector.
The project was carried out in interdisciplinary cooperation with Florian Cech and Fabian Fischer (Centre for Informatics & Society, TU Vienna) and Gabriel Grill (PhD student at the School of Information, University of Michigan).
Starting in 2021, a new semi-automated assistance system (AMAS for short) is supposed to calculate job seekers' future chances on the labour market. On the basis of past statistics, job seekers will be classified into three groups, to which different resources for further education are allocated. However, as this study shows, the AMS algorithm has far-reaching consequences for jobseekers, AMS staff and the AMS as a public service institution.
The so-called "AMS algorithm" is controversial. Critical voices speak of an algorithmic manifestation of discrimination on the labour market. The Institute for Technology Assessment (ITA) of the Austrian Academy of Sciences analysed the technical functioning and social impacts together with the Technical University of Vienna on behalf of the Chamber of Labour.
Background: Johannes Kopf, member of the AMS Board, made a remarkable statement when announcing this large-scale changeover: "Our new assistance system takes this reality into account, but logically it cannot discriminate itself." This statement reflects the myth of technology as a value-neutral tool, although social-science research on technology has long shown that the design and use of technology always anchor certain values, norms and interests in society. Big-data analyses are likewise surrounded by an aura of truth, objectivity and accuracy that is not empirically proven but is nevertheless reproduced again and again in public debates. These two misconceptions also accompanied the gradual introduction of the AMS algorithm, which is currently halted due to a ruling by the data protection authority.
How it works: Based on statistics from past years, the system calculates job seekers' future chances on the labour market. Job seekers are classified into three groups according to the forecast of their "integration chances", to which different resources for further education are allocated. The algorithmic system looks for correlations between job seekers' characteristics and successful employment. These characteristics include age, country group, gender, education, care obligations and health impairments, as well as past employment, contacts with the AMS and the labour-market situation at the place of residence. The aim is to invest primarily in those job seekers for whom support measures are most likely to lead to reintegration into the labour market.
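The mechanism described above can be sketched in a few lines of code. The following is an illustrative Python sketch only: the actual AMAS model, its features, coefficients and cut-off values are not reproduced here, and every weight and threshold below is an invented placeholder. It merely shows the general pattern the study describes, namely a statistical score computed from personal characteristics and then thresholded into three support groups.

```python
import math

def integration_score(features, weights, bias):
    """Weighted sum of characteristics, squashed to [0, 1] via a logistic link.

    Both the feature names and the weights are hypothetical placeholders,
    not the real AMAS coefficients.
    """
    z = bias + sum(weights[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))

def classify(score, high_cutoff=0.66, low_cutoff=0.25):
    """Map a predicted 'integration chance' to one of three groups.

    The cut-off values here are invented for illustration.
    """
    if score >= high_cutoff:
        return "high"    # good prospects: fewer support resources
    if score < low_cutoff:
        return "low"     # poor prospects: fewer support resources
    return "medium"      # main target group for training measures

# Hypothetical job seeker profile (feature names and weights are invented)
profile = {"age_over_50": 1, "health_impairment": 0, "care_obligations": 1}
weights = {"age_over_50": -0.8, "health_impairment": -1.1, "care_obligations": -0.5}
score = integration_score(profile, weights, bias=0.9)
group = classify(score)
```

The key point of the sketch is the thresholding step: because the continuous score is collapsed into three coarse groups that steer resource allocation, small differences in characteristics such as age or care obligations can move a person across a cut-off and change the support they receive.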
The above considerations are the starting points for our analysis of the AMS algorithm from a socio-technical perspective. We investigate the following questions:
The project investigated the socio-technical dimensions of the AMS algorithm on the basis of current research, specifically from Critical Data Studies and the field of Fairness, Accountability and Transparency in Sociotechnical Systems. By analysing internal and publicly available AMS documents, the conceptual, technical and social implementation of the system was examined. In addition, comparative studies of similar systems in other countries were consulted.