An algorithm for the unemployed?

Socio-technical analysis of the so-called “AMS algorithm” of the Austrian Public Employment Service (AMS)

Starting in 2021, a new semi-automated assistance system (AMAS for short) is supposed to calculate job seekers' future chances on the labour market. On the basis of past statistics, job seekers are classified into three groups, to which different resources for further education are allocated. However, as this study shows, the AMS algorithm has far-reaching consequences for job seekers, AMS staff and the AMS as a public service institution.

The so-called “AMS algorithm” is controversial. Critical voices speak of an algorithmic manifestation of discrimination on the labour market. On behalf of the Chamber of Labour, the Institute for Technology Assessment (ITA) of the Austrian Academy of Sciences, together with the Technical University of Vienna, analysed the system's technical functioning and its social impacts.

Background: Johannes Kopf, member of the AMS Board, made a remarkable statement when announcing this large-scale changeover: “Our new assistance system takes this reality into account, but logically it cannot discriminate itself.” This statement reflects the myth of technology as a value-neutral tool, although social science research into technology has long shown that the design and use of technology always anchor certain values, norms and interests in society. Big-data analyses are likewise surrounded by an aura of truth, objectivity and accuracy that is not empirically proven, but is nevertheless reproduced again and again in public debates. These two misconceptions also accompanied the gradual introduction of the AMS algorithm, which is currently suspended due to a ruling by the data protection authority.

How it works: Based on statistics from past years, the system calculates job seekers' future chances on the labour market. Job seekers are classified into three groups based on the forecast of their “integration chances”, and different resources for further education are allocated to each group. The algorithmic system looks for correlations between job-seeker characteristics and successful employment. The characteristics include age, country group, gender, education, care obligations and health impairments, as well as past employment, contacts with the AMS and the labour-market situation in the place of residence. The aim is to invest primarily in those job seekers for whom the support measures are most likely to lead to reintegration into the labour market.
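The classification described above — a statistical score derived from personal characteristics, then split into three groups — can be illustrated with a minimal sketch. Note that the feature names, coefficients and group thresholds below are purely hypothetical assumptions for illustration; the actual AMS model, its weights and its cut-off values are not reproduced here:

```python
import math

def predict_integration_chance(features: dict, coefficients: dict,
                               intercept: float) -> float:
    """Toy logistic model: maps job-seeker characteristics to a
    probability-like 'integration chance' (IC) between 0 and 1.
    Coefficients and intercept are illustrative assumptions only."""
    z = intercept + sum(coefficients[name] * value
                        for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))

def classify(ic: float, high: float = 0.66, low: float = 0.25) -> str:
    """Split job seekers into three groups by IC value.
    The thresholds 0.66 and 0.25 are assumptions for this sketch."""
    if ic >= high:
        return "high"    # good short-term prospects
    if ic < low:
        return "low"     # poor long-term prospects
    return "medium"      # middle segment, main target of support funds
```

The key design point this sketch makes visible is that the group boundaries are fixed numerical thresholds applied to a single score, so any bias encoded in the underlying statistics feeds directly into the group assignment.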

Research questions

The above considerations are the starting points for our analysis of the AMS algorithm from a socio-technical perspective. We investigate the following questions:

  • What are AMAS' objectives at organisational and operational level? Which guiding principles inform the technical design of the system?
  • How was AMAS technically implemented? Which data on past occupational histories are used and what components does the AMS algorithm consist of?
  • Which forms of bias, discrimination and error rates play a role, and how were these specifically taken into account?
  • What effects can the use of the system have on current practice of the employment service and on job seekers?
  • What recommendations regarding bias and discrimination, transparency, political steering processes and governance institutions can be derived from this analysis?


The project investigated the socio-technical dimensions of the AMS algorithm on the basis of current research, specifically Critical Data Studies and the field of Fairness, Accountability and Transparency in Sociotechnical Systems. By analysing internal and publicly available AMS documents, the conceptual, technical and social implementation of the system was examined. In addition, comparative studies of similar systems in other countries were consulted.

Key results

The computer-generated “integration chance” (IC value) was intended merely as an additional aid for the AMS in the counselling of job seekers. However, as this study shows, the AMS algorithm has far-reaching consequences for job seekers, AMS staff and the AMS as an organisation:

  1. An increase in the efficiency of the counselling process can only be achieved if the computer-generated classification is adopted largely as a matter of routine. This is in line with the general trend towards rationalising services, but is not conducive to the service orientation of a semi-governmental institution whose public mandate is to increase job seekers' chances.
  2. The increase in “training effectiveness” achieved by concentrating funding on the middle segment is not primarily aimed at the accuracy and quality of measures for individual AMS customers; rather, it combines the objective of “effective” use of funds with a rough division into three customer groups.
  3. Hardly any procedures to avoid bias were used in the development of the system, and in its application the system offers no safeguards against possible structural inequalities of treatment. Gender-equality aspects have traditionally played an important role in the AMS support process, which can run counter to an objectified, numerical (and not bias-optimised) classification of integration chances.


The development of algorithmic systems for (semi-)governmental institutions such as the AMS requires anti-discrimination measures, as well as system and data transparency, in order to enable comprehensive evaluation from technical, fundamental-rights, democratic and rule-of-law perspectives.

In addition, the study recommends rights of access and objection for those affected, public consultations, and training in new skills for AMS counsellors and customers wherever algorithmic systems are used in the public sector.


The project was carried out in interdisciplinary cooperation with Florian Cech and Fabian Fischer (Centre for Informatics & Society, TU Vienna) and Gabriel Grill (PhD student at the School of Information, University of Michigan).