An algorithm for the unemployed? Socio-technical analysis of the so-called "AMS algorithm" of the Austrian Public Employment Service (AMS)

Starting in 2021, a new semi-automated assistance system (AMAS for short) was supposed to calculate job seekers' future chances on the labour market. On the basis of past statistics, job seekers are classified into three groups, to which different resources for further education are allocated. However, as this study shows, the AMS algorithm has far-reaching consequences for job seekers, AMS staff and the AMS as a public service institution.

The so-called "AMS algorithm" is controversial. Critical voices speak of an algorithmic manifestation of discrimination on the labour market. On behalf of the Chamber of Labour, the Institute of Technology Assessment (ITA) of the Austrian Academy of Sciences, together with TU Wien, analysed the system's technical functioning and its social impacts.

Background: Johannes Kopf, member of the AMS Board, made a remarkable statement when announcing this large-scale changeover: "Our new assistance system takes this reality into account, but logically it cannot discriminate itself." This statement reflects the myth of technology as a value-neutral tool, although social science research on technology has long shown that the design and use of technology always anchor certain values, norms and interests in society. Big-data analyses are likewise surrounded by an aura of truth, objectivity and accuracy that is not empirically proven but is nevertheless reproduced again and again in public debates. These two misconceptions also accompanied the gradual introduction of the AMS algorithm, which is currently suspended following a ruling by the data protection authority.

How it works: Based on statistics from past years, the system calculates job seekers' future chances on the labour market. Job seekers are classified into three groups based on the forecast of their "integration chances", and different resources for further education are allocated to these groups. The algorithmic system looks for correlations between job-seeker characteristics and successful employment. The characteristics include age, citizenship (by country group), gender, education, care obligations and health impairments, as well as past employment, contacts with the AMS and the labour-market situation in the place of residence. The aim is to invest primarily in those job seekers for whom the support measures are most likely to lead to reintegration into the labour market.
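The three-group logic described above can be sketched in a few lines of code. This is an illustrative sketch only, not the actual AMS model: the logistic scoring function, the feature weights and the cut-off values (0.66 for the "high" group, 0.25 for the "low" group) are assumptions for illustration, and the real system's coefficients are not reproduced here.

```python
from math import exp

def integration_chance(features, weights, bias=0.0):
    """Toy logistic score: estimated probability of labour-market (re)integration."""
    z = bias + sum(weights[name] * value for name, value in features.items())
    return 1.0 / (1.0 + exp(-z))

def classify(ic_short, ic_long, high_cut=0.66, low_cut=0.25):
    """Map short- and long-term 'integration chance' values to the three groups
    described in the text (cut-off values are illustrative assumptions)."""
    if ic_short >= high_cut:
        return "high"    # good short-term chances -> little extra support
    if ic_long < low_cut:
        return "low"     # poor long-term chances -> fewer training resources
    return "middle"      # main target group for support measures

# Hypothetical feature weights; only the signs loosely mirror the kinds of
# characteristics named above, the real coefficients are not public in this form.
weights = {
    "age_over_50": -0.8,
    "care_obligations": -0.5,
    "health_impairment": -0.7,
    "higher_education": 0.9,
    "weak_regional_labour_market": -0.4,
}

person = {"age_over_50": 1, "care_obligations": 1, "health_impairment": 0,
          "higher_education": 0, "weak_regional_labour_market": 1}
ic_short = integration_chance(person, weights, bias=1.0)
ic_long = integration_chance(person, weights, bias=2.0)
print(classify(ic_short, ic_long))  # -> middle
```

The point of the sketch is the segmentation step: whatever statistical model produces the probabilities, the final output collapses an individual's estimated chances into one of three coarse resource categories.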

Research questions


The above considerations are starting points for our analysis of the AMS algorithm from a socio-technical perspective. We investigate the following questions:

  • What are AMAS's objectives at the organisational and operational levels? Which guiding principles inform the technical design of the system?
  • How was AMAS technically implemented? Which data on past occupational histories are used and what components does the AMS algorithm consist of?
  • Which forms of bias, discrimination and error rates play a role, and how were these specifically taken into account?
  • What effects can the use of the system have on current practice of the employment service and on job seekers?
  • What recommendations regarding bias and discrimination, transparency, political steering processes and governance institutions can be derived from this analysis?

Approach


The project investigated socio-technical dimensions of the AMS algorithm based on current research, specifically from Critical Data Studies and the field of Fairness, Accountability and Transparency in Sociotechnical Systems. The conceptual, technical and social implementation of the system was examined by analysing internal and publicly available AMS documents. In addition, comparative studies of similar systems in other countries were consulted.

Key results


The computer-generated "integration chance" (IC value) was intended only to provide AMS counsellors with an additional aid in supporting job seekers. However, as this study shows, the AMS algorithm has far-reaching consequences for job seekers, AMS staff and the AMS as an organisation:

  1. An increase in the efficiency of the counselling process is achieved only if the computer-generated classification is predominantly adopted as a matter of routine. This is in line with the general trend towards rationalising services, but runs counter to the service orientation of a semi-governmental institution whose public mandate is to increase job seekers' chances.
  2. The increase in "training effectiveness" achieved by concentrating funding on the middle segment is not primarily aimed at the accuracy and quality of measures for individual AMS clients; rather, it combines the objective of "effective" use of funds with a rough division into three customer groups.
  3. Hardly any procedures to avoid bias were applied during the system's development, and in use the system offers no safeguards against possible structural inequalities in treatment. Gender-equality aspects have traditionally played an important role in the AMS support process, which can run counter to an objectified, numerical (and not bias-optimised) classification of integration chances.

Recommendations


The development of algorithmic systems for (semi-)governmental institutions such as the AMS requires anti-discrimination measures as well as system and data transparency, in order to enable a comprehensive evaluation from a technical, fundamental-rights, democratic and rule-of-law perspective.

In addition, wherever algorithmic systems are used in the public sector, the study recommends rights of access and objection for those affected, public consultations, and training in new skills for AMS advisors and clients.

 

The project was carried out in interdisciplinary cooperation with Florian Cech and Fabian Fischer (Centre for Informatics & Society, TU Vienna) and Gabriel Grill (PhD student at the School of Information, University of Michigan).

Publications


  • Riedlinger, D. (2022). Wie diskriminierend sind Computersysteme? ITA-Newsfeed. Retrieved from https://www.youtube.com/watch?v=k0ueTXsVcOM
  • ITA (Ed.). (2021). How fair is the AMS Algorithm? ITA-Dossier no. 52en (March 2021; authors: Astrid Mager, Doris Allhutter). Wien. doi:10.1553/ita-doss-052en
  • Riedlinger, D. (2021). Covid-19 Live-Diskussion. ITA-Newsfeed. Retrieved from https://www.oeaw.ac.at/ita/detail/news/covid-19-live-diskussion
  • Allhutter, D. (2021). Ein Algorithmus zur effizienten Förderung der Chancen auf dem Arbeitsmarkt? WISO – Zeitschrift für Sozial- und Wirtschaftswissenschaften, 44. Jg., 82-95. Retrieved from https://www.isw-linz.at/fileadmin/user_upload/HP_Allhutter.pdf
  • Riedlinger, D. (2021). Das ITA in den Medien. ITA-Newsfeed. Retrieved from https://www.oeaw.ac.at/ita/detail/news/das-ita-in-den-medien
  • Riedlinger, D. (2021). ITA-Forscherin Doris Allhutter erhält Käthe-Leichter-Preis. ITA-Newsfeed. Retrieved from https://www.oeaw.ac.at/ita/detail/news/ita-forscherin-doris-allhutter-erhaelt-kaethe-leichter-preis
  • Riedlinger, D. (2021). Video-Diskussion: Algorithmen – Freund oder Feind? ITA-Newsfeed. Retrieved from https://www.oeaw.ac.at/ita/detail/news/video-diskussion-algorithmen-freund-oder-feind
  • ITA (Ed.). (2021). Wie fair ist der AMS-Algorithmus? ITA-Dossier Nr. 52 (Jänner 2021; AutorInnen: Astrid Mager, Doris Allhutter). Wien. doi:10.1553/ita-doss-052
  • Riedlinger, D. (2020). Woher haben Maschinen ihren Hausverstand? ITA-Newsfeed. Retrieved from https://www.oeaw.ac.at/detail/news/woher-haben-maschinen-ihren-hausverstand
  • Riedlinger, D. (2020). Interview: Wie fair ist der AMS-Algorithmus? ITA-Newsfeed. Retrieved from https://www.oeaw.ac.at/detail/news/wie-fair-sind-algorithmen
  • Riedlinger, D. (2020). Die neue ITA-Studie zum AMS-Algorithmus im Überblick. ITA-Newsfeed. Retrieved from https://www.oeaw.ac.at/ita/projekte/der-ams-algorithmus
  • Allhutter, D., Mager, A., Cech, F., Fischer, F., & Grill, G. (2020). Der AMS-Algorithmus - Eine soziotechnische Analyse des Arbeitsmarktchancen-Assistenz-Systems (AMAS) (p. 120). Wien. doi:10.1553/ITA-pb-2020-02
  • Allhutter, D., Cech, F., Fischer, F., Grill, G., & Mager, A. (2020). Algorithmic Profiling of Job Seekers in Austria: How Austerity Politics Are Made Effective. Frontiers in Big Data, Special Issue Critical Data and Algorithm Studies, 17. doi:10.3389/fdata.2020.00005
  • ITA (Ed.). (2020). AMS Algorithm on trial. ITA-Dossier no. 43en (February 2020; authors: Doris Allhutter, Fabian Fischer, Astrid Mager). Wien. doi:10.1553/ita-doss-043en
  • ITA (Ed.). (2019). AMS-Algorithmus am Prüfstand. ITA-Dossier Nr. 43 (Juli 2019; AutorInnen: Doris Allhutter, Fabian Fischer, Astrid Mager). Wien. doi:10.1553/ita-doss-043

Conference Papers/Speeches


  • 10/11/2022 , Nürnberg
    Doris Allhutter: 
    Kumulative Benachteiligung im Arbeitsmarktchancen-Assistenz-System (AMAS)
    Invited Talk at the Colloquium "Data Analytics and Machine Learning"
    Other Invited Lecture
  • 18/02/2022 , München
    Doris Allhutter: 
    Social Inequality in Austria’s Employment Prospects Assistance System
    Roundtable „Next Generation AI“ – Social Aspects of AI
    Other Invited Lecture
  • 18/02/2022 , Berlin
    Doris Allhutter: 
    Diskriminierungsrisiken von Frauen im Arbeitsmarktchancen-Assistenzsystem (AMAS)
    Auswirkungen von algorithmischen Systemen auf die Arbeitsmarktchancen von Frauen
    Other Invited Lecture
  • 13/12/2021 , Vienna
    Astrid Mager: 
    Digital Technologies and Labour Market Policy: the AMS Algorithm
    Ringvorlesung Governing Algorithms
    Other Invited Lecture
  • 10/10/2021 , Salzburg
    Doris Allhutter: 
    Citizen Profiling in the Public Sector: Reframing the Relationship between Citizens and the State
    Symposium Ethics & Sustainability 2021
    Other Invited Lecture
  • 23/06/2021 , Salzburg
    Doris Allhutter: 
    Effizienter Sozialabbau durch Algorithmen?
    Lecture series "Macht.Gesellschaft.Hochschule"
    Other Invited Lecture
  • 25/03/2021 , Berlin (online)
    Doris Allhutter: 
    How Algorithms for Job Seekers shape the relationship between citizens and the state.
    Workshop on Software Tools and Algorithms for Jobseekers
    Other Invited Lecture
  • 09/03/2021 , Stockholm (online)
    Doris Allhutter: 
    How Austerity Politics Are Made Effective: Algorithmic Profiling of Job Seekers in Austria.
    Swedish Network on Automated Decision-Making in the Public Sector (Keynote)
    Keynote
  • 21/01/2021 , Linz
    Doris Allhutter: 
    Algorithmic Welfare. Citizen Profiling in the Public Sector.
    JKU Lecture Series Artificial Intelligence
    Other Invited Lecture
  • 16/10/2020 , online/Hallstadt
    Doris Allhutter,  Astrid Mager: 
    Algorithmic Welfare: BürgerInnen-Profiling im Wohlfahrtsstaat
    Locating and Timing Matters: Significance and Agency of STS in Emerging Worlds
    Other Lecture
  • 18/08/2020 , Prague
    Doris Allhutter,  Astrid Mager: 
    Algorithmic profiling of job seekers in Austria: how to make austerity politics effective
    Locating and Timing Matters: Significance and Agency of STS in Emerging Worlds
    Other Lecture
  • 04/11/2019 , Bratislava
    Doris Allhutter,  Astrid Mager: 
    Algorithmic profiling of job seekers in Austria: how to make austerity politics effective
    Fourth European Technology Assessment Conference
    Other Lecture

Duration

08/2019 - 10/2020

Project team

Funding

Partners