If you’re in the business of publishing or storytelling (maybe you’re a news organisation or a broadcaster), let’s cut to the chase: if you publish content on third-party platforms, you need to check out the NiemanLab post on the latest Australian court ruling. Or read the article ABC wrote about it. Why should you care? Because it’s a potential biggy for the rest of the world, at least for publishers. The essence of the ruling, which holds that a publisher is responsible for anything anybody posts on its Facebook page, is summarised by Joshua Benton:
“That a news publisher should be held accountable for the journalism it publishes is obvious. That it should be held accountable for reader comments left on its own website (which it fully controls) is, at a minimum, debatable. But that it should be held legally liable for the comments of every rando who visits its Facebook page — in other words, the speech of people it doesn’t control, on a platform it doesn’t control — is a big, big step.”
But that is exactly what the Australian court has decided, based on the case of Dylan Voller, a young man with a serious juvenile record. When ABC’s investigative TV show Four Corners aired “Australia’s Shame”, an episode about abuse in prison, it also placed some of the content on Facebook, and Voller’s case was discussed in the show.
Benton says: “Voller’s lawyers argued that by posting their content on Facebook, news organizations were therefore engaged in the act of “publishing” all the comments that followed. Publishers, understandably, disagreed and claimed they were at most “innocent disseminators” of the troublesome comments.”
But the court rejected the publishers’ position, stating that a news org having a public Facebook page “has little to do with freedom of speech or the exchange of ideas”; instead, “the primary purpose of the operation of the public Facebook page is to optimise readership…and to optimise advertising revenue.” The “exchange of ideas” is merely “a mechanism…by which that is achieved.”
So the publishers took the matter to a higher court to fight the outcome. As of today, they have lost - well, at least in Australia. So if you’re reading this from your newsroom in Melbourne, for example, you can be held responsible for all the comments, naughty statements, uploaded content and anything else that is defined as unlawful. Which in turn means publishers must now monitor everything that takes place on their social media pages.
Benton, in the NiemanLab article, is very critical and a bit ‘on guard’ that this ruling might also reach the USA. He’s afraid that every (news) publisher is now doomed to “pre-moderate every single comment, all the way up to publishing standard, posted on its Facebook page. (And, presumably, any other social platform where an outlet posts to an account and other people can respond.)
That seems both unlikely for most outlets and would certainly be an unwise use of a news organisation’s limited resources. And it would mean removing anything resembling a significant truth claim that a social media staffer couldn’t immediately verify as true.”
We’ve thought about this long and hard, and decided that we could help.
Meet our latest notification feature, 'The Trumpet'
The Trumpet monitors, in real time, everything that is commented or posted on your platforms. Based on keywords, whitelists and a range of other rules, it alerts your newsroom to stories you should keep an eye on, respond to, or hand over to webcare. This means you’d always be able to react within a reasonable time frame to any problematic posting, and rest in the comfort of knowing that our system is watching for any online activity that could be seen as defamatory. And you keep control: you define what you believe is ‘defamatory’, and monitoring takes place within those boundaries.
This is how The Trumpet works
Smartocto already collects all the metrics coming from your Facebook pages: reach, consumption patterns, link clicks and everything relating to engagement. Especially relevant here is the fact that our system also stores the reactions, the comment text and Facebook’s negative feedback (the hides, the flagging and the unfriending).
The formula is simple. If a Facebook post draws far more nasty reactions than average, a lot of negative feedback, and offensive or illegal wording in the comments (the publisher can maintain a blacklist of words and terms), the Trumpet immediately sends a message to the inbox you prefer: e-mail, WhatsApp, Slack, Teams - we can even connect your own messaging system, if you have one. So you’ll be alerted right away and can decide whether webcare or editorial action is needed.
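Smartocto hasn’t published the Trumpet’s internals, but the rule described above can be sketched in a few lines. This is a minimal, illustrative sketch only: all names, fields and thresholds below are hypothetical, not smartocto’s actual code or API.

```python
# Illustrative sketch of a comment-alerting rule like the one described above.
# All names, fields and thresholds are hypothetical examples.
from dataclasses import dataclass


@dataclass
class PostMetrics:
    comments: list            # comment texts collected for the post
    negative_feedback: int    # hides + flags reported for the post
    avg_negative_feedback: float  # historical average for this page


# Publisher-maintained blacklist of words and terms (example entries).
BLACKLIST = {"fraudster", "criminal"}


def blacklist_hits(comments):
    """Count comments containing any blacklisted term."""
    return sum(
        any(term in comment.lower() for term in BLACKLIST)
        for comment in comments
    )


def should_alert(m, spike_factor=3.0):
    """Alert when negative feedback spikes well above the page's
    average, or when blacklisted wording shows up in the comments."""
    spike = m.negative_feedback > spike_factor * m.avg_negative_feedback
    return spike or blacklist_hits(m.comments) > 0


post = PostMetrics(
    comments=["Great piece!", "That man is a fraudster."],
    negative_feedback=4,
    avg_negative_feedback=5.0,
)
print(should_alert(post))  # blacklist hit triggers an alert even without a spike
```

In a real system the alert would then be pushed to the newsroom’s channel of choice (e-mail, Slack, and so on); the point of the sketch is just the two-part trigger: an unusual negative-feedback spike or a blacklist match.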
We’ve built that notification algorithm along with a connected, modular dashboard - we call it Waves - showing the positive and especially the negative Facebook metrics. On your phone, on your laptop, on the wall. Our notifications and dashboards can do a lot more - contact us and we’ll show you. But at the very least, smartocto Trumpet can make you feel a little more at ease. Especially if you’re Australian.