Many of us would like our Twitter, Facebook and YouTube accounts to be free from harassment and not associated with compromising content. At the same time, we all want full freedom of expression.
Social media allow us to connect with many people. This brings clear advantages, but just as many risks, some of them underestimated. For this reason, a filter is necessary. Unfortunately, although content moderation is standard practice on all social media, it is constantly kept out of sight and the methods used to implement it are kept secret.
All platforms are subject to moderation
Platforms must, in one way or another, moderate: to protect individual users from the network, to protect groups of users from their antagonists, and to remove what is offensive, abject or illegal. The challenge lies in determining exactly when, how and why to intervene.
Moderating content is difficult: reports arrive in a continuous, numerous stream; the rules to be applied are often unclear; and a single mistake can generate a public scandal that obscures millions of invisible successes. And mistakes can happen because every rule leaves room for an exception: anything that is not clearly illegal can be defended and become acceptable.
But who sets the rules? The rules are imposed by the company that owns the platform and shared with users. Many of them borrow from American norms and customs, but they must be adapted to the countries and societies that have access to the platform. Moderation can thus be used to adapt the nature of the platform to the external constraints imposed on it.
No form of moderation can be called impartial, because it is always the result of choices dictated by non-negotiable rules.
Our data: the price to pay to access social networks
The laws in force in the United States provide specific penalties for anyone who circulates obscene or socially unacceptable material on the net.
However, the same legislation assigns no liability to the media used to disseminate these materials. Social media function as a means of communication and regulate themselves independently: they can claim the right to remove users or delete content without having to answer for any of these actions.
The birth of the web is linked to the dream of a space in which freedom, neutrality and meritocracy would be the dominant qualities, making it possible to build a widespread, shared culture and to create a democratic information society.
Reality does not exactly correspond to the dream. Access to the network is the result of choices, as is access to a platform. We users are given the chance to join freely and at no charge. The price to pay is the need to share our personal data and to accept becoming the target of promotional actions tailored to us.
The platform, therefore, is not a neutral space but a hybrid entity: it is neither a channel nor a content, neither just a network nor just a medium.
The guidelines for managing a virtual community
Almost all of us subscribe to a platform without reading its conditions of use. The community guidelines represent its articles of association: a translation into plain words of the Terms of Service, the legal contract signed at the time of registration.
The contract a platform offers us emphasizes the idea of joining an extended community while maintaining our freedom of expression, and while remaining responsible for whatever we say, write or view.
However, the guidelines are nothing more than a list of rules to be followed precisely and mechanically, prohibitions framed as means of serving personal and collective interests. We are asked to enter the community intending to give our best and to promote the circulation of content that contributes positively to improving it.
In reality, there are no definitive, clear, incontrovertible guidelines: they can be adapted to the socio-political context of the country in which the platform operates, or modified by the company that owns the social network to suit its own needs.
Three imperfect solutions for controlling an immense volume of content
What characterizes current social media is the massive number of subscribed users and the unstoppable flow of content circulating on the platforms. Managing such a flow of data is a complex undertaking.
The ideal condition would be a preventive check on content before it circulates, but what is normally adopted is a "publish, then filter" policy, implemented through different methods according to the case.
- Editorial review: the platform takes exclusive responsibility for reviewing content. It is the most thorough form of moderation, especially when carried out in advance, as Apple does: a careful and timely check performed before publication.
- Flagging: the platform asks the user community to report dubious content. This distributes the workload across many more people, clarifies the platform's rules of use and empowers users. Moderation becomes a follow-up service on the reports received.
- Use of automated techniques: such software is neither easy to use nor highly effective, because it cannot contextualize messages before interpreting and possibly removing them. False positives and false negatives are not always detectable.
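The flagging approach above can be sketched in a few lines of code. This is a minimal, hypothetical model of a "publish, then filter" pipeline, not any real platform's API: every name here (Post, ModerationQueue, FLAG_THRESHOLD) is an illustrative assumption. Content goes live immediately; only after enough distinct users flag it does it reach a human review queue.

```python
from dataclasses import dataclass, field

# Assumed policy: this many distinct users must flag a post
# before it is queued for human review.
FLAG_THRESHOLD = 3

@dataclass
class Post:
    post_id: int
    text: str
    flags: set = field(default_factory=set)  # ids of users who flagged it
    visible: bool = True                     # "publish, then filter": live at once

class ModerationQueue:
    def __init__(self):
        self.pending = []  # posts awaiting human review

    def flag(self, post: Post, user_id: int) -> None:
        # A set means repeated flags from the same user count only once.
        post.flags.add(user_id)
        if len(post.flags) >= FLAG_THRESHOLD and post not in self.pending:
            self.pending.append(post)

    def review(self, post: Post, violates_rules: bool) -> None:
        # The final call is still human: remove the post or clear its flags.
        self.pending.remove(post)
        if violates_rules:
            post.visible = False
        else:
            post.flags.clear()
```

The sketch makes the article's point concrete: the community supplies the signal, but the platform alone decides the threshold and the outcome, so the workload is distributed while the power is not.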
All of these methods are imperfect, though widely tested, and none is adopted in ideal conditions. They are an integral part of digital culture and shape the policies adopted by platforms and, more generally, by the network.