


Digital rights and technology


Listen to this chapter, read aloud by Kelly Mostert

BIJ1 fights for online and offline privacy, as well as for the freedom to move through public space without every movement being recorded and stored. We see the internet as a public space, which should be designed to serve public rather than commercial interests. Digital means of communication must be accessible and secure for everyone, and our data must be well protected. Above all, the digitisation of public services must be carried out in a fair way, free from discrimination and ethnic profiling. In this way, we are working towards a Netherlands where digital technologies contribute to equal opportunities and social inclusion.


Digital technology is everywhere: in our pockets, our houses and our streets. This brings convenience and benefits, but also comes with risks. Our online and offline behaviour is monitored by companies and government bodies. Our data is stored, processed and used to make predictions about loans, benefits and crime. Who we are and what we do sometimes becomes publicly visible because government agencies or companies do not protect our data well enough. This is unacceptable. In addition, our data is used for commercial purposes without our consent, or we are forced to give consent in order to access internet services. As a result, companies such as Google and Facebook have become superpowers with huge masses of data at their disposal. On top of this, it is difficult to hold them accountable when they violate our privacy or human rights. Because these technologies are so new, there is hardly any legislation or policy for them within Europe. It is high time to rein in the power of these tech giants.


Schools, hospitals, the Tax and Customs Administration, the UWV, the police and other government bodies all use digital technologies to store, process and analyse our data by means of algorithms. This can be useful and important, but the use of algorithms by public authorities also carries huge societal risks. Too often, algorithms are trialled on the most vulnerable groups in society. Why are there big investments in the algorithmic detection of social welfare fraud, but not in the detection of white-collar crime in banks and large companies? This is unfair and promotes ethnic and class profiling. The technologists who develop new technology and algorithms are not always free of prejudice themselves, and can, consciously or not, build those prejudices into their algorithms, so that a calculation model itself can be racist or discriminatory.

If the police mainly enter (criminal) data on marginalised areas into algorithms, and much less from other neighbourhoods, these algorithms will predict a distorted picture of reality, encourage segregation and further stigmatise marginalised areas. Because of the way algorithms are shared and built upon by others, this process will be repeated over and over again. In this way, discrimination, prejudice and ethnic profiling will be incorporated deeper and deeper into computer models: models the government blindly trusts when assessing benefits and allowances, or predicting crime.
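The feedback loop described above can be illustrated with a toy simulation (all numbers are hypothetical, chosen purely for illustration): two neighbourhoods with the exact same real crime rate, where patrols are sent to the area with the most crime on record, and only patrolled crime gets recorded.

```python
# A deliberately simple toy model of the feedback loop described above
# (hypothetical numbers, not real data). Both areas have the SAME true
# crime rate, but the "predictive" model sends every patrol to the area
# with the most crime on record, and patrols only record crime where
# they are sent.

true_rate = 0.5              # identical underlying crime rate in both areas
recorded = [60.0, 40.0]      # area 0 starts with slightly more historical records

for day in range(365):
    target = 0 if recorded[0] >= recorded[1] else 1   # patrol the "hot" area
    recorded[target] += 10 * true_rate                # only its crime is recorded

share = recorded[0] / sum(recorded)
print(f"{share:.2f}")  # area 0's share of all records after one year: 0.98
```

Although the two areas are identical in reality, area 0's share of the records grows from 60% to about 98% in a year, because the model's own output decides where new data is collected. This is exactly the self-reinforcing distortion the paragraph describes.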

Discrimination and ethnic profiling in the digitisation of public services must therefore be nipped in the bud now. The Netherlands does not yet have the means to do this. The Dutch Data Protection Authority, which monitors compliance with the General Data Protection Regulation, has neither the mandate nor the means to combat and punish (possible) discrimination in algorithms and data processing.

In order to ensure greater digital security and justice, BIJ1 will take the following measures.


  1. Access to the Internet is a fundamental right and must therefore be accessible to everyone. There will be free Internet access for everyone.
  2. We will further develop educational programmes for all Dutch residents, making them more aware of their digital and privacy rights.
  3. Accessibility requirements must be included in digitisation processes, taking into account the experiences of people with disabilities. Alternative channels such as the telephone or ticket office should remain open to people who have difficulties with digital technologies, such as the elderly, the illiterate or people with intellectual disabilities.
  4. The government must facilitate the development of open source and open standards in the public and private sectors.


  1. We strive for legislation and policy at the Dutch and European level that officially defines the Internet as a utility rather than a marketplace dominated by companies.
  2. There will be a ban on the unsolicited sale of personal information.
  3. We are working at Dutch and European level to curb the power of technology giants such as Facebook and Google and to protect citizens from this.
  4. A ban will be imposed on facial recognition software and similar software that identifies people on the basis of posture or appearance.
  5. End-to-end encryption remains guaranteed. This is digital letter secrecy: secure communication between two people without a third party having access to it. This is an indispensable protection of the right to privacy and secure communication.
  6. The extension of camera surveillance in the public space will be stopped.
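The "digital letter secrecy" in point 5 rests on the idea that two people can agree on a shared secret key that an eavesdropper watching all their traffic cannot compute. A toy Diffie-Hellman key exchange sketches the principle; the numbers here are deliberately tiny and insecure, chosen only for illustration, while real systems use the same idea with far larger parameters.

```python
# Toy Diffie-Hellman key exchange with deliberately tiny numbers
# (illustration only; real deployments use 2048-bit-plus parameters).
p, g = 23, 5                 # public values: a small prime and a generator

alice_secret = 6             # private: known only to Alice
bob_secret = 15              # private: known only to Bob

# Each side sends only g^secret mod p; the secrets themselves never travel.
alice_public = pow(g, alice_secret, p)   # 8
bob_public = pow(g, bob_secret, p)       # 19

# Both sides derive the SAME key from the other's public value.
alice_key = pow(bob_public, alice_secret, p)
bob_key = pow(alice_public, bob_secret, p)

assert alice_key == bob_key
print(alice_key)  # shared secret: 2
```

An eavesdropper sees only p, g, 8 and 19; recovering the shared key from those values is the discrete-logarithm problem, and at realistic key sizes that is computationally infeasible. That is what guarantees that no third party, not even the service carrying the messages, can read the communication.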


  1. There is an active public policy against discrimination and ethnic profiling in the digitisation of public services – from the design of algorithms to the evaluation of digitisation processes. This is only possible if the people who guide these processes represent the various population groups in the Netherlands.
  2. There will be an independent supervisory body to monitor the fight against discrimination, ethnic profiling and the violation of human rights in the collection of data on citizens by the government. This monitor will have the mandate to request and assess these data and to give legally binding advice to government bodies.
  3. We are establishing strict supervision and a human rights test for the export of surveillance software by Dutch companies. Under no circumstances may Dutch surveillance software contribute to human rights violations abroad.
  4. The Intelligence and Security Services Act (Wiv) must better guarantee human rights. The Dutch intelligence and security services must stop exchanging unevaluated bulk data with foreign intelligence services.
  5. Digitisation policy will focus on citizens’ rights and needs, not on digitisation for its own sake.
