Digital welfare states: boundaries and opportunities

A Dutch court case has set out a framework within which the emergent digital welfare state can respect the right to privacy.

Public authorities are increasingly using new technologies to perform public services. The latest ideas concern health-care apps to prevent the further spread of the coronavirus. Worldwide, there are many more examples of what the United Nations calls digital welfare states. Although governments argue that new technologies make their services more efficient and cost-effective, many observers express concern about the surveillance of citizens.

Such controversies tend to attach to individual episodes. Given the widespread emergence of digital welfare states, universal guidelines are needed to explore the opportunities they offer but also their legitimate boundaries. In a first court case, human rights have proved to offer relevant guidance.

Many welfare states have started using big data and algorithms in their social-security provision. Digital welfare states may be defined as having systems of social protection and assistance which are driven by digital data and technologies that are used to automate, predict, identify, surveil, detect, target and punish. For instance, data-driven tools are used to detect social-security fraud. Likewise, some governments use location data to track and trace the whereabouts of their citizens, aiming to halt the spread of the coronavirus.

"Social Europe publishes thought-provoking articles on the big political and economic issues of our time analysed from a European viewpoint. Indispensable reading!"

Columnist for The Guardian

Thank you very much for your interest! Now please check your email to confirm your subscription.

At first sight, such apps offer quick solutions to governments. Yet hasty decisions hinder proper research and in-depth debates on their effectiveness, necessity and side-effects. A suggestion by the Dutch government to create track-and-trace apps drew a public response from 60 experts. They warned against rapid implementation, urging that the purpose, necessity and effectiveness of such apps be weighed against the fabric of society, including fundamental rights and freedoms.

Quoting Michel Foucault, the experts wrote: "Surveillance is permanent in its effects, even if it is discontinuous in its action." They expressed fear that the apps would set a precedent for future use of comparably invasive technologies after the Covid-19 crisis had subsided, and so stressed that any app should be temporary, necessary, proportionate, transparent, completely anonymous, voluntary and managed by an independent body.

Other discussions of new technologies refer to similar principles. Universal guidelines are thus needed to underpin the development and functioning of any new technology used in digital welfare states. The recent judgment by the district court of The Hague shows that international human rights form a proper basis on which to create such guidelines.

The first ever court case using human rights to assess new technologies in digital welfare states focused on the Dutch System Risk Indication (SyRI). The SyRI lawsuit was taken against the Dutch state by a coalition of non-governmental organisations, supported by the then UN special rapporteur on extreme poverty and human rights, Philip Alston, who wrote an amicus brief to the court.

SyRI was established to detect welfare fraud, collating no fewer than 17 categories of personal data gathered by different public agencies. These included information on employment, detention, sanctions, finances, education, pensions, childcare allowances, benefit receipt and health insurance. SyRI has been used recurrently, especially in neighbourhoods with poorer and more vulnerable residents. It has analysed data using an algorithm with risk indicators, thus selecting potentially fraudulent claimants. The algorithm and its indicators were kept secret out of fear that citizens would start gaming the system.

The court ruled that SyRI violated important human rights and therefore should be ended immediately. For the UN this was nothing less than a landmark ruling, for the first time arresting, on grounds of human rights, the use of digital technologies and abundant information-processing by welfare authorities. It set an important legal precedent and could inspire NGOs across the globe to influence the public debate or even to go to court themselves.

The court stressed the right to respect for private and family life, home and correspondence in article 8 of the European Convention on Human Rights, and paid special attention to achieving a fair balance between the collective interest of society in fighting fraud and the resulting limitation of the individual right to respect for private life. The state had a special responsibility to safeguard this fair balance when using new technologies, the court said.

SyRI's lack of transparency about its functioning prevented scrutiny of whether there was such a balance. It could even result in unfair judgments involving discriminatory distinctions between people, for instance based on socio-economic or migrant status. This might have severe negative consequences, not only for the individuals concerned but also for society at large. Not only fraudsters were caught up in the large-scale data processing but, in the case of SyRI, everyone living in a certain neighbourhood and anyone flagged as potentially claiming illegitimately.

The court did not say that the government could never use new technologies. It found fighting fraud a legitimate aim. Equally, however, new technologies raised questions about the right to protection of personal data. Adequate protection of privacy contributed to trust in government, whereas inadequate protection and too little transparency had the opposite effect: they could make citizens afraid and less willing to share their data. In addition, SyRI was not convincing in terms of its necessity, its proportionality or the purpose of its data-processing.

Here, the court used European Union data-protection regulations to explain the principles underpinning a fair balance between rights and purposes: transparency, purpose limitation and data minimisation. Such principles also appear in the guidelines for contact-tracing apps recently promulgated by the EU's eHealth Network.

All these sources could be used to convert similar messages into universal guidelines for digital welfare states, enabling them to benefit from new technologies in a responsible manner. Then, new technologies could contribute to the economic and social wellbeing of all citizens.

An article on the SyRI court case will appear in the Netherlands Yearbook of International Law, vol 50.
