8 ways to understand the mechanisms that regulate social platforms

Many of us would like our Twitter, Facebook and YouTube accounts to be free from harassment and not associated with compromising content. At the same time, we all want full freedom of expression.

Social media allow us to connect with many people. This brings clear advantages, but just as many risks, often underestimated. For this reason, some form of filtering is necessary. Unfortunately, although content moderation is standard practice on every social platform, it is consistently kept out of sight and the ways it is carried out are kept secret.

All platforms are subject to moderation

Platforms must, in one way or another, moderate to protect each user from the network, to protect a group of users from their antagonists, but also to remove what is offensive, abject or illegal. The challenge lies in determining exactly when, how and why to intervene.

Moderating content is difficult: reports are numerous and continuous, the rules to apply are often unclear, and a single mistake can trigger a public scandal that overshadows millions of invisible successes. And mistakes can happen, because every rule leaves room for an exception: anything that is not plainly illegal can be defended and become acceptable.

But who sets the rules? The rules are imposed by the company that owns the platform and shared with its users. Many of them borrow American norms and customs, but they must be adapted to the countries and societies that access the platform. Moderation can thus be used to adapt the nature of the platform to the external constraints imposed on it.

No form of moderation can be said to be impartial, because it is always the result of a choice dictated by non-negotiable rules.

Our data: the price to pay to access social networks

The laws in force in the United States provide specific penalties for anyone who circulates obscene or socially unacceptable material on the net.

However, the same legislation does not assign any liability to the media used to disseminate this material. Social media platforms act as a means of communication and regulate themselves: they can claim the right to remove users or delete content without having to answer for any of these actions.

The birth of the web is linked to the dream of a space in which freedom, neutrality and meritocracy would be the dominant qualities, making it possible to build a widespread, shared culture and to create a democratic information society.

Reality does not quite match the dream. Access to the network is the result of choices, as is access to a platform. We users are given the chance to choose freely and at no charge. The price to pay is the need to share our personal data and to accept becoming the target of promotional campaigns tailored to us.

The platform, therefore, is not a neutral space but a hybrid entity: it is neither a mere channel nor mere content, neither just a network nor just a medium.

The guidelines for managing a virtual community

Almost all of us sign up to a platform without reading its conditions of use. The community guidelines are its articles of association: a plain-language translation of the Terms of Service, the legal contract accepted at the time of registration.

The contract a platform offers us emphasizes the idea of joining an extended community while keeping our freedom of expression, yet remaining responsible for whatever we say, write or look at.

In practice, however, the guidelines are nothing more than a list of rules to be respected precisely and mechanically, prohibitions presented as a means of serving both personal and collective interests. We are asked to join the community with the intention of giving our best and of promoting content that can contribute positively to improving it.

In reality, there are no definitive, clear, incontrovertible guidelines: they can be adapted to the socio-political context of the country in which the platform operates, or modified by the company that owns the social network to suit its own needs.

Three imperfect solutions for controlling an immense amount of content

What characterizes current social media is the massive number of subscribed users and the amount of content circulating on the platforms, an unstoppable flow. Managing such a flow of data is a complex undertaking.

The ideal condition would be a preventive check on content before it is put into circulation, but platforms normally adopt a “publish, then filter” policy, resorting to different methods depending on the case.

  1. Editorial review: the platform alone is responsible for reviewing content. It is the most thorough form of moderation, especially when it can be carried out in advance, as Apple does: a careful, timely check performed before publication.
  2. Flagging: the platform asks the user community to report dubious content. This spreads the workload across many more people, makes the platform’s rules of use explicit and gives users responsibility. Moderation then becomes a follow-up service for the reports received.
  3. Automated techniques: this software is neither easy to use nor very effective, because it cannot place messages in context before interpreting them and possibly removing them. False positives and false negatives are not always verifiable. A minimal sketch of how these methods might combine in a “publish, then filter” pipeline follows this list.
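
The sketch below is a toy illustration, not a real platform’s system: content goes live immediately, an automated score handles the clearest cases, and community flags or uncertain scores feed a human review queue. All names, thresholds and the stand-in classifier are hypothetical, chosen only to show how the three imperfect methods can feed one another.

```python
from dataclasses import dataclass

# Illustrative thresholds (hypothetical values, not taken from any real platform).
AUTO_REMOVE_THRESHOLD = 0.95   # near-certain violations are removed automatically
REVIEW_THRESHOLD = 0.60        # uncertain cases are queued for human reviewers
FLAGS_FOR_REVIEW = 3           # community reports needed to trigger a review

@dataclass
class Post:
    post_id: str
    text: str
    flags: int = 0             # reports received from the community
    status: str = "published"  # published -> under_review -> removed

def automated_score(post: Post) -> float:
    """Stand-in for a real classifier; as noted above, such software lacks
    context, which is exactly where false positives and negatives come from."""
    banned_markers = ("spam-link", "slur")
    return 0.99 if any(marker in post.text for marker in banned_markers) else 0.10

def triage(post: Post, review_queue: list) -> None:
    """Apply the 'publish, then filter' logic to a post that is already live."""
    score = automated_score(post)
    if score >= AUTO_REMOVE_THRESHOLD:
        post.status = "removed"            # automated technique acts alone
    elif score >= REVIEW_THRESHOLD or post.flags >= FLAGS_FOR_REVIEW:
        post.status = "under_review"       # flag follow-up / editorial review
        review_queue.append(post)
    # otherwise the post simply stays published

if __name__ == "__main__":
    queue: list = []
    flagged = Post("42", "ordinary message", flags=4)
    triage(flagged, queue)
    print(flagged.status, len(queue))      # under_review 1
```

In a real setting, the review queue would then be worked through by the people described in the next section; the sketch only shows how automated scoring and user flagging hand cases over to human reviewers.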

All of these methods are imperfect, however widely tested, and work as intended only in ideal situations. They are an integral part of digital culture and shape the policies adopted by the platforms and, more generally, by the network.

People involved in moderation activities

Behind each platform is the review work done by numerous people. The need to review a huge amount of data implies the obligation to draw on human resources external to the company that owns the platform.

Most of the review work is hidden: part of it is carried out at the headquarters of the owning company, but the largest share is outsourced to service companies located in countries that are very distant, in both geographical and cultural terms.

Since the activity is distributed among different actors, it is not easily verifiable and is marked by particularly stressful working conditions. Errors, distortions and misunderstandings occur very easily.

The actors that can be involved in the moderation process are:

  1. employees: they sit at the top of the review process and are usually young professionals with solid experience in the sector;
  2. crowdworkers (a large mass of workers hired by an outside company to operate on a platform): a veritable army, often subjected to frantic rhythms and underpaid. Their work is psychologically draining, since they filter out all the ugliness that circulates on the net;
  3. community managers: moderators with the power to supervise the review activity carried out by subgroups of users, and who can step in to resolve technical problems or settle disputes between the various parties involved;
  4. flaggers: ordinary users who are given the opportunity to report anomalies and abuses in the content circulating on a given platform. It is a voluntary and, in some ways, specialized contribution, because these users know the cultural context. Their judgment, however, may not be impartial;
  5. experts: they can be positioned at different levels to help resolve any disputes.

Managing such a diverse and large group of people involves some risks:

  1. subjective interpretation of the rules imposed;
  2. difficulties in coordinating and managing reports.

Moderation as a space for political dispute

The greater or lesser visibility that a group with political or social interests can obtain online is a fundamental element for its success and its recognition.

If moderation favors or penalizes a user or a group of users, it can give rise to appeals motivated by the need not to lose one’s position or see one’s image diminished on the net.

Platforms’ decision to delete or filter content

When content is inappropriate for a limited number of users, it can be deleted or hidden. As a rule, the tendency is to delete it, because this is considered the safer way to protect the public as a whole, even if it conflicts with the principle of freedom of expression that characterizes the network.

Some platforms, however, prefer to limit the visibility of reported content, applying moderation by design (strategic moderation): software that filters data and distributes it only to authorized users.

This is an automatic process (sketched in code after the list below), after which a user may:

  • search for the content in question and get no results at all;
  • search for the content in question and obtain a misleading result, one that omits what the platform wanted to hide.
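
A minimal sketch of what such a visibility filter might look like: the same query returns different results depending on who is asking. The catalogue, region codes and restriction rules are hypothetical, used only to illustrate the two outcomes listed above.

```python
# Hypothetical content store: each item records where it has been restricted.
CATALOG = {
    "p1": {"text": "ordinary news item", "restricted_in": set()},
    "p2": {"text": "sensitive news report", "restricted_in": {"XX"}},
}

def search(query: str, viewer_region: str) -> list[str]:
    """Return matching items, silently dropping anything restricted for the
    viewer's region; the viewer cannot tell that something was withheld."""
    return [
        item["text"]
        for item in CATALOG.values()
        if query in item["text"] and viewer_region not in item["restricted_in"]
    ]

print(search("sensitive", "US"))  # ['sensitive news report']
print(search("sensitive", "XX"))  # [] -> no results displayed at all
print(search("news", "XX"))       # ['ordinary news item'] -> a list that omits the hidden post
```

Depending on the query, the restricted item either vanishes without trace or is quietly missing from an otherwise plausible result list, which is what makes this kind of moderation so hard for users to detect.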

The restriction or removal of content may be disclosed by the platform (as Twitter often does), but there is no obligation to do so. Restrictions and removals can also be the consequence of specific requests made by the governments of states that are less open on matters of freedom of expression.

These procedures can lead to the coexistence of multiple similar versions of the same social network: Facebook, for example, is not a single entity; rather, there is a multitude of Facebooks, depending on the content made available to each user.

The set of restrictive conditions applied defines which resources we can access and which are barred to us, determines which activities can be carried out on the platform and, last but not least, can also condition our judgment on a given issue, with possible political and social consequences.

What the platforms are and what they should be

As users, it is important for us to understand how content moderation is carried out on the platforms, who performs this task and for what purposes. To that end:

  1. content moderation should be much more transparent, while remaining discreet;
  2. we should be able to choose independently what to see;
  3. we should be able to use the same profile to access multiple platforms;
  4. users who do not respect the rules should be excluded from the platform;
  5. platforms should adapt to our interests and needs, not condition them.

At the same time, our view of moderation must change: it is not an occasional event but a constant, essential and defining element of every platform. Moderation is the essence of a platform; it is the service it offers us.

Similarly, those who run the platforms can no longer deny their responsibility as custodians of the data of such a large, heterogeneous and contested public community, an environment created by the social media themselves. Since the platforms are accountable to us as users, they too must declare and share the means and rules they use to ensure the network’s security.

Given all these premises, when we manage to find our own space within the network, we can say that we have learned to live it: to use the right tools to express ourselves, to engage and to persuade.
