Documentary exposes the threat of facial recognition surveillance in Serbia · Global Voices

Surveillance “really challenges your sense of dignity…”

One of the thousands of cameras conducting video surveillance in the streets of Belgrade, Serbia’s capital. Photo by SHARE Foundation, used with permission.

This story is based on reporting by Global Voices’ content partner Meta.mk News Agency, a project of Metamorphosis Foundation

A civic initiative called #hiljadekamera (‘thousands of cameras’ in English) has been raising concerns about the deterioration of privacy in Serbia resulting from the introduction of a video surveillance system with advanced facial recognition in the capital, Belgrade. As part of the campaign, a documentary with the same title was released.

The Serbian government, in cooperation with the Chinese technology company Huawei, has been actively working on the implementation of the surveillance project, called Safe City, in Belgrade since 2019. The project involves the installation of thousands of smart surveillance cameras with object and facial recognition features. The cameras were procured as part of a bundle that included an artificial intelligence system used to analyse the footage they capture.

Hiljade Kamera is led by SHARE Foundation, a leading Serbian digital rights group established in 2012 and a member of the European Digital Rights (EDRi) network. On its website (hiljade.kamera.rs), launched in May 2020, the initiative describes itself as “a community of individuals and organisations that advocate the responsible use of surveillance technology.” It is pushing for respect for the right to privacy and for accountability in relation to the government surveillance program through a number of tactics, including crowd-mapping, community building, research, advocacy and content production.

SHARE Foundation produced a concise documentary summarizing the situation. The documentary is available in Serbian, with English subtitles.

In the video, experts and representatives of the initiative and of the Serbian National Data Protection Authority raised concerns about the surveillance project.

Bojan Perkov, a policy researcher at SHARE Foundation, noted in an article published on May 19 that the governments of Serbia and China have been working on “technical and economic cooperation” since 2009, when they signed their first bilateral agreement. Several years later, a strategic partnership was forged between Serbia’s Ministry of Interior and Huawei, paving the way for the implementation of the project “Safe Society in Serbia”. Over the past several months, new cameras have been installed widely throughout Belgrade.

Perkov further questioned the legality of the program’s implementation:

Even though the Ministry was obliged by law to conduct a Data Protection Impact Assessment (DPIA) of the new smart surveillance system, it failed to fulfil the legal requirements, as warned by civil society organisations and the Commissioner for Personal Data Protection.

The threats of biometric surveillance

The documentary includes a contribution by Ella Jakubowska of European Digital Rights (EDRi), the leading network fighting for digital rights in Europe, who stressed the risks emanating from mass surveillance:

There’s a real sense of empowerment from being able to express yourself differently and suddenly, if you’re forced to conform, this poses a real threat to your identity. It really challenges your sense of dignity and who you are as a person and who you’re allowed to be in your society in a way that’s very dangerous.

This segment is part of an extensive interview conducted by SHARE Foundation, which provides wider context on the threats that biometric mass surveillance poses to human rights and freedoms.

Surveillance cameras equipped with facial recognition software capture and analyse facial features in order to identify an individual by matching the data against existing databases.
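For illustration only, the sketch below shows how such a matching step typically works in a generic face-recognition pipeline, using the open-source Python face_recognition library. It is not the system deployed in Belgrade, whose internals are not public, and the image files and watchlist named here are hypothetical examples.

```python
# Illustrative sketch of a generic face-matching pipeline using the
# open-source `face_recognition` library. This is NOT the Belgrade system;
# file names and the "watchlist" are hypothetical examples.
import face_recognition

# Hypothetical watchlist: reference photos of known individuals,
# each encoded once in advance (assumes one face per reference image).
known_images = ["person_a.jpg", "person_b.jpg"]
known_encodings = [
    face_recognition.face_encodings(face_recognition.load_image_file(path))[0]
    for path in known_images
]

# A frame captured by a camera (here simply an image file for the example).
frame = face_recognition.load_image_file("camera_frame.jpg")

# Detect faces in the frame and compute a 128-dimensional encoding for each,
# then compare every detected face against the watchlist.
for encoding in face_recognition.face_encodings(frame):
    matches = face_recognition.compare_faces(known_encodings, encoding, tolerance=0.6)
    if any(matches):
        print("Face matched an entry in the database")
    else:
        print("No match found")
```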

Ms Jakubowska noted:

Any society that looks to stratify people based on how they look, based on their health, based on their data and things about them, is an incredibly authoritarian and sinister society. The societies throughout history that have tried to separate and stratify people based on data about them are the sort of authoritarian societies that we want to stay as far away as possible from…

The EDRi representative stressed that “the people need to hold those in power to account, to be calling out surveillance when they see it and contributing to civil society organisations and the activists that are trying to reveal these secretive rollouts.” Collaboration among all stakeholders and demand for public debate are key to preventing situations in which the power to decide is taken from citizens and lies only in the hands of private companies or police forces, she added.