The terrorist attacks of January 2015 in France led to a massive demonstration in Paris celebrating freedom of speech. Ironically, instead of creating new rights, the antiterrorist law resulting from these events could harm investigative journalism and citizens' right to privacy.
The new intelligence bill and the right to privacy
After the Charlie Hebdo and Hyper Cacher attacks, the shock in France was profound. A new surveillance law was announced by Prime Minister Manuel Valls a few weeks after the attacks. This new intelligence bill is currently being examined by the National Assembly (one of the chambers of the French Parliament). It prescribes many measures that increase the powers of the secret services, and it has raised a tense debate in French society.
The most criticized measure is the use of “black boxes” to be installed at French telecom operators. These systems are algorithms that automatically flag people whose behavior on the internet looks suspicious, such as visiting or participating in jihadist websites, being in contact with presumed terrorists, and so on. Once these people are identified, the secret services would be able to access their private information. The bill is very unpopular, to say the least. Some French hosting companies oppose it and threaten to leave the country if it is adopted. Even Charlie Hebdo, the newspaper used by the government to justify such a restrictive measure, strongly criticized this use of “black boxes”. The bill is perceived not only as inefficient but also as intrusive into citizens’ private lives. Anyone could be suspected and watched by the secret services, even those who study terrorism in order to fight it.
The paradox of post-Charlie France: dialogue between opponents and supporters of increased surveillance.
Indeed, algorithms are not very clever. These systems are used to scan a huge volume of information. An algorithm acts mechanically and cannot replace human expertise: it will not distinguish between an actual potential terrorist and a journalist investigating these issues. If journalists are watched, the confidentiality of their sources is threatened and their work becomes more complicated. More generally, this could be a threat to citizens who were simply searching for in-depth information and who would see their right to privacy and to the protection of their personal data restricted.
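To make this limitation concrete, here is a minimal, purely illustrative sketch of a keyword-based flagging rule. The keyword list, the function and the example browsing histories are all invented, since the real “black boxes” are secret; the point is only the failure mode: mechanical matching sees keywords, not intent.

```python
# Purely illustrative sketch: a naive keyword-based flagging rule.
# The keyword list and example histories are hypothetical; the real
# systems are secret. It shows why mechanical matching cannot tell a
# journalist's research apart from a suspect's activity.

SUSPICIOUS_KEYWORDS = {"jihad", "attack planning", "explosives"}  # hypothetical list

def is_flagged(browsing_history: list[str]) -> bool:
    """Flag a user if any visited page title contains a listed keyword."""
    return any(
        keyword in page.lower()
        for page in browsing_history
        for keyword in SUSPICIOUS_KEYWORDS
    )

journalist = ["Interview notes: jihad recruitment networks in Europe"]
suspect = ["Forum thread on jihad and attack planning"]

# Both are flagged: the rule only sees keywords, never intent.
print(is_flagged(journalist), is_flagged(suspect))  # True True
```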
Social
networks and the protection of freedom of speech
Algorithms are already widely used in internet regulation, so it is possible to see their limits and to foresee the problems the secret services would face in fighting terrorism with them. Social networks use algorithms to moderate the huge amount of content that is used, produced and distributed by users on their platforms. They are often criticized for blocking content mechanically, which sometimes harms freedom of speech. Facebook is known to block nudity, which sometimes leads to the removal of artworks (Facebook is being taken to court after blocking the account of a French teacher who posted Courbet’s painting “The Origin of the World”) and even of breast cancer awareness campaigns, because its algorithms automatically classify them as pornography. It is also difficult to suppress IS flags with these systems, because of the risk of also blocking news articles or posts denouncing the organization. For the same reason, algorithms cannot simply block offensive hashtags, because those hashtags are reused by people denouncing the ones who launched them, as was the case with #jesuiskouachi, tweeted more by opponents of the attack than by actual supporters of the Kouachi brothers, the perpetrators of the Charlie Hebdo massacre. Algorithms have their drawbacks and cannot be as effective as case-by-case examination.
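This mechanical blindness is easy to reproduce. Below is a purely illustrative sketch of hashtag blocking; the blacklist and the tweets are invented examples, and real moderation pipelines are far more complex, but the rule blocks the denunciation just as readily as the endorsement.

```python
# Purely illustrative sketch of mechanical hashtag blocking.
# The blacklist and tweets are invented examples; real moderation
# pipelines are more complex, but the failure mode is the same:
# context is ignored.

BLOCKED_HASHTAGS = {"#jesuiskouachi"}

def should_block(tweet: str) -> bool:
    """Block any tweet containing a blacklisted hashtag, regardless of intent."""
    words = tweet.lower().split()
    return any(tag in words for tag in BLOCKED_HASHTAGS)

supporter = "#jesuiskouachi they were heroes"
critic = "Disgusted by everyone tweeting #jesuiskouachi today"

# Both tweets are blocked: the filter cannot tell support from denunciation.
print(should_block(supporter), should_block(critic))  # True True
```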
“The Origin of the World” by the realist painter Courbet was censored by Facebook.
This is the biggest issue of internet regulation. It has become very difficult to control information online because of the evolution of new media. As Axel Bruns stresses, internet users have also become producers of content, which leads to a huge amount of heterogeneous information and to blurred lines between amateurs, specialists and journalists. This is particularly the case on social networks, where users produce, use and share content. Algorithms are used to regulate this huge amount of information, which is a big change compared with the regulation of content in traditional media. There, the specific context in which information was consumed, and the knowledge of who the audiences were, allowed content to be regulated accordingly: whether the information would be consumed in private or not, whether children were likely to watch, and so on. That context has now changed completely, because content can be accessed on different platforms, an aspect of the media convergence phenomenon described by Henry Jenkins, and can be accessed at any time thanks to the internet. Moreover, newspapers, radio stations and TV channels take the time to justify the content they publish and to explain their choices according to the ethics of journalism. Social networks refer to their guidelines but do not always justify why they remove some content and not other content. Because of trade secrecy, we do not know how their algorithms work, which content they target, or according to which ethical and moral principles.
The French media specialist Olivier Ertzscheid explains that social networks’ guidelines act as the only law online and denounces the opacity of the algorithms used to enforce these rules. Who ultimately decides which content is targeted? It is hard to accept that so much power over the information received by more and more people around the globe is exercised by unknown actors according to obscure moral principles. This can harm freedom of speech, as Instagram recently illustrated by censoring a picture of a woman with period blood on her clothes and bed. The picture belonged to a project by the artist Rupi Kaur, whose aim was to demystify periods and denounce the taboo around them; it was finally reinstated by the social network, which acknowledged its “mistake”.
Algorithms thus raise the issue of freedom of speech where social networks are concerned and the issue of privacy where the new French intelligence bill is concerned. The sheer volume of information we are exposed to makes it difficult for public authorities to find efficient ways to fight terrorism or, for example, pedophilia, without restricting fundamental freedoms. In France, the same kind of algorithmic system used by social networks to spot inappropriate content would be used by the secret services to spot “inappropriate behavior” through French operators. The limits would be the same: just as Facebook interprets a breast cancer campaign as pornography, these algorithms would be likely to make mistakes too. And the French secret services do not need broader surveillance powers to make mistakes: before the Charlie Hebdo attack, they confused the address of one of the Kouachi brothers with that of a namesake who was… an 81-year-old man.