It’s time social media companies are recognised as publishers

Social media companies, such as Facebook, are adamant that they are ‘tech platforms’ or ‘technology companies’, rather than publishers. This, they claim, frees them of responsibility for monitoring and editing the content that appears on the websites they host. It also protects them from possible legal action or sanction if something is posted to their platforms that might be defamatory or, indeed, against the law (inciting racial hatred, for example).

Clearly, such an editing task would be enormous, even for a company the size of Facebook. It could signal the end of social media as we know it (“not a bad thing”, I hear some of you cry).

There are a number of compelling points that build a strong case for redefining social media companies as publishers and, with it, making them legally responsible for everything that appears on their platforms.

They own the domain

Every company that owns and controls a website – other than social media companies – is 100% responsible for the content that is posted to that website. If they allow other people to post things, they are still responsible for that content. If it defames someone, breaks laws such as contempt of court, or acts as a front for a scam operator, the owner of the website is responsible (unless there are mitigating factors, such as the website having been hacked). Social media companies own their domains and websites and allow people to post content to them. As yet, they are immune to any repercussions when that content breaches civil or criminal law.

Social media companies already act as censors

Social media companies already act as censors. They can and do remove content (not necessarily consistently) if they deem it contravenes their terms of service – whether because it infringes someone’s copyright, or because it is considered too offensive, distasteful or otherwise in breach of their ‘terms of use’ or ‘community standards’. Facebook, for example, ‘restricts the display of nudity’ and ‘content that glorifies violence’ on its site.

In doing this there is tacit acknowledgement that some content should be prohibited. At the moment, the what and the when are left solely to the social media companies themselves.

They are using the content to generate advertising revenue

Social media companies, in the main, ‘monetise’ their operations by selling advertising space. It’s a fairly safe bet that not many people use Facebook, Twitter, Instagram or YouTube simply to watch the adverts that appear – for most people, having to ‘endure’ the adverts is the price you pay for using the site to read your friends’ posts, watch funny videos or broadcast to the world what a great time you’re having.

The social media companies are using the content – including yours – and people’s interactions with the site to put advertising in front of them. Some of this content will be content that the general public or wider society may judge to be on the wrong side of what’s acceptable. And yet social media companies are profiting from it.

Motivation to change

Social media companies understand they need to exert some level of editorship over the content that appears on their sites. However, they are either unwilling or unable to put in place systems that regulate all of the material that appears. Undoubtedly, it would affect their profitability were they to do so.

Over the last few days there has been a great deal of discussion about the suicide of a young girl who had viewed images of self-harm on Instagram, and the role that this may have played. Facebook (the owner of Instagram), has pledged to do “whatever it takes” to make its social media platform safer for young people.

But should it be left up to companies like Facebook to decide what should be done and how far it should go?

If social media companies were recognised as publishers they would have to abide by the rules and laws that apply to all publishers.

There will be the issue of different laws applying in different legal jurisdictions (where has the content actually been published?). That’s a bigger problem that applies to the whole Internet. But it shouldn’t prevent responsible governments from doing the right thing.

Doing the right thing

At the very least it sends out the right message. And if the potential threat of legal action were a stark reality for the social media ‘publishers’ (as they would be known), it is highly likely they would collectively be a great deal stricter as to who and what they let loose on their platforms.

Maybe there are too many vested interests in keeping things the way they are. For social media companies, certainly, but perhaps also for advertisers on these platforms, who have found an easy way to accurately target their prospective customers.

Because of the speed at which the popularity of social media (and the Internet) has grown, legislators either haven’t had it on their radar or don’t quite know how to bring it into line.

But it really is a matter of principle. Any civilised society agrees on a set of values and rules that everyone must abide by or else face the consequences of their actions.

At the moment social media companies are not playing their part in the Internet society, while benefitting hugely from it. They have demonstrated that they can’t or won’t properly control content posted to their sites, using the ‘it’s not us, it’s them’ excuse. Recognising them in law as publishers would take this excuse away.

Categories: Opinion