The Australian Broadcasting Corporation (ABC) crowds out free speech The ABC quells speech. Commercial media outlets compete for the attention of Australians, but much attention is diverted to the ABC’s advertising-free offerings. Commercial media, as a consequence, have a smaller and less diverse voice than would otherwise be the case.
Because the ABC is government owned, its speech is widely perceived to be less biased and more authoritative. The perceived authority of the ABC gives its staff considerable power over the content and the limits of the speech people hear in Australia. Our livelihoods are harmed as a result of the diminishment of alternative voices.
Conclusions and action steps There is no clear basis for distinguishing between commercial and political speech. Both political decisions and commercial transactions affect people’s livelihoods. As for importance, most people are more pressingly concerned about matters that directly affect their incomes and expenditures than political debates about, for example, border control.
In sum, the freedom of commercial speech is just as deserving of protection as that of any other speech.
Economic theory and experimental evidence alike weigh against speech restrictions, on the ground that they reduce welfare.
Unfortunately, the weight of reason and evidence is often rejected with the unsupported claim that ‘things are different this time’.
With that obstacle in mind, I propose that any speech restriction should be discarded, unless evidence, from scientific experiments and forecasting, of a clear net benefit is produced.
3 Free speech in the digital age
3.1 Professor Julian Thomasviii
Director, Swinburne Institute for Social Research
Topic: Free speech and the internet
I’d like to thank the Commission for the opportunity to participate in this important and timely conversation.
The theme of my remarks is that we know the internet makes a huge contribution to freedom of expression and political debate in this country. We know that contribution is also extraordinarily conflicted and contested. But we also need to know a lot more about it than we do.
Commissioner Wilson has talked about the problem of complacency in Australia around free speech.
If we look at the ingredients of complacency, one of those is a lack of knowledge. And a lack of knowledge, as I’m going to illustrate briefly, can also have other undesirable consequences.
Thirty years ago, the U.S. political scientist Ithiel de Sola Pool famously described communication technologies as ‘technologies of freedom’.74 His influential argument was that if we were able to free technology from the dead hand of government control, then the electronic communications of the day – he was writing about video disks and cable TV – had the potential to open up politics, culture and enterprise in unprecedented ways.
The idea of the internet as a liberal machine continues to powerfully shape our expectations of what it can do or should do. But that image of technologies of freedom has been complicated dramatically by recent and very hard experience. Liberal uncertainties have replaced liberal confidence.
First of all, we’ve discovered that the technologies of freedom can also be technologies of unfreedom. We know the internet can be used as a system for control and suppression. Our free communication can become an information resource for others, for purposes we know not. That level of surveillance has taken many of us, naïve and trusting users of the internet, by surprise.
In Australia, scholars like Mark Andrejevic, formerly at the University of Queensland, have begun tracing the trajectories of what they call sensor societies, where we the users are better understood as elements in information-gathering systems than as Sola Pool-style subjects with the power to manage our own communication systems.
The second thing that has happened is that the social distribution and adaptation of the internet over the last couple of decades has been extraordinarily rapid. In the last decade, we have gone from a low speed, dial-up information and text communications system to mobile broadband, and the internet of images and video. Meanwhile, Schumpeterian disruptions have become endemic in industries that are important for freedom of expression, especially journalism, with outcomes that remain far from settled.
Our knowledge of the net, together with our policy assumptions about it, has not kept up with the pace of change. When we began seriously researching Australians’ uses of the internet in the early 2000s, sample surveys were our main research tools. We asked users how many times a day they checked their email; we asked how many times they logged onto the internet. Now those questions are meaningless, and the telephone surveys that we used to ask them are also becoming obsolete.
viii Professor Julian Thomas is Director of the Swinburne Institute for Social Research and a Professor of Media and Communications at Swinburne University of Technology, Melbourne. His research interests are in new media, information policy and the history of communications technologies.
A further complication of this recent history is that internet policy is now a space occupied by an increasing range of players. Some of them are quite new and some are old, but the players in the ‘ecology of games’ that comprises internet policy now include multiple government agencies and regulators, NGOs, community groups, international bodies, internet service providers, telcos, interest groups of all kinds, and other technical and business intermediaries.75 They all have different objectives and policy goals. They deal with common subject matter and the same basic problem (how, and to what degree, should we regulate the internet?), but they are players in separate, albeit parallel, policy arguments, defined by issues such as digital rights, hate speech and child protection.
Lastly, the liberal vision of technologies of freedom has been complicated by the new kinds of public space being produced by the internet, which shape political discussion in ways we have only just begun to understand. These spaces provide the locations for the parallel policy games. Take Twitter as a simple example – a platform that didn’t exist ten years ago is clearly a powerful element in contemporary Australian public political discourse, but for now it is a medium we don’t know much about, despite its openness to exciting but so far undeveloped new research methods.
Twitter started in July 2006, and now has something like 700 million current accounts worldwide. We can estimate – with the help of colleagues such as Axel Bruns and Jean Burgess at Queensland University of Technology, who are pioneering scholars in this field – that in late 2014 there might be between 2.5 and 3 million Australian Twitter account holders, generally well-educated, mainly located in the south-eastern Sydney, Melbourne and Canberra triangle, more urban than not. There are slightly more male than female Twitter users. These are the basics – we need to go much further before moving into a regulatory mode with these new technologies.76
A well-known cultural critic some time ago observed that every successful new technology creates new kinds of accidents. You can see these accidents very often on Twitter. Email, which barely still qualifies as new, continues to produce them.
Regulatory accidents arising from new technologies are generally more serious. Let me mention one example, which I hope demonstrates the need for more knowledge about the internet and, axiomatically, regulatory forbearance in the absence of knowledge. The example I have in mind is the Commonwealth’s 2007 Northern Territory ‘Intervention’, or ‘National Emergency Response’, enabled by the suspension of the Racial Discrimination Act 1975 (Cth).
One element of the intervention that has received little attention was the requirement for auditing of what were called ‘publicly funded computers’ in prescribed areas. Those rules imposed a set of obligations on the users and administrators of those computers, which applied to no other Australians.
The provisions in question went well beyond the concerns about pornography articulated in the Little Children are Sacred report.77 The Northern Territory intervention targeted a range of remarkably broadly defined speech acts including harassment, vilification, defamation, annoying people, abusing people, being offensive, being obscene – all of those things were proscribed.
My colleagues and I at Swinburne’s Institute for Social Research have recently investigated the administration of this aspect of the Intervention and its consequences. Our study will be published shortly, but I can summarise our findings here.78 It has been difficult to escape the conclusion that the auditing provisions, and the system intended to enact them, were not well designed, well communicated or well understood. Several key points can be briefly stated:
• The Intervention’s auditing regulations had a very broad scope. The category of ‘publicly funded computers’ was never clearly defined. It appears that, for the purposes of the regulation, almost any computer in a prescribed area may have been regarded as a public computer.
• The funds devoted to implementing this aspect of the Intervention could have been invested in more productive ways, building digital capability and literacy in the communities concerned.
• Remote Aboriginal communities have some of the lowest rates of access to internet in Australia. The auditing regulations exacerbated Australia’s digital divide by imposing additional costs and risks for those extending digital inclusion in those communities.
• Lastly, the restrictions could only be enforced on computers located in government offices and shared facilities. It was the digital divide between Indigenous Australians and other Australians that made the policing of that particular policy possible.
The Intervention has left a complex legacy in the Northern Territory. In the domain of the internet, there can be little doubt that it prejudiced the freedom of some of Australia’s most disadvantaged citizens, people with very few of the communication rights and capabilities that we take for granted in Australia’s cities.
What lessons should we draw from that case? Australians have little in the way of constitutional protection for freedom of expression. There is therefore no cause for complacency, and every reason for regulatory forbearance in matters where that freedom is at issue. Where we believe we need it, regulation should focus on specific risks rather than broadly defined ones. It should avoid large, symbolic targets. Finally, its effectiveness should be tested and weighed against alternative forms of intervention and management.
My team at Facebook is responsible for managing the standards for how people can use the product. And that means, especially relevant to today’s conversation, what people can share and what people can post when they are on Facebook.
This is truly a global task because Facebook’s community is increasingly a global community, with people all over the world engaging within their own countries and with one another.
We currently have over 1.3 billion people regularly using the product. The vast majority are outside the United States and Canada. So this company may have started in the U.S., but make no mistake about it, it is definitely a global company and a global community.
Our policies and the way we think about speech on Facebook have to reflect that. That is one of the reasons that my team, and the teams who craft, enforce and apply the policies, are globally located and review content in many different languages; at the same time, we have to have one set of policies.
That is in part because of the way that the product works. One person in Australia writes something, somebody in Germany comments on it, a person in Canada likes that comment. So that is something people may not realise about Facebook.
The challenge for my team is how to write a set of policies that can be applied globally, taking into account the many different cultures, backgrounds and legal landscapes of the countries where the people that use Facebook live.
We try to do it through a three-pronged approach. The first prong is transparently telling people what we expect when they use Facebook. That is our community standard, which I’ll talk a little bit about in a second.
The second prong is giving people the tools they need to control their experience. Because there may be a piece of content that is upsetting to somebody that does not violate our principles and standards, but we want to make sure that the person has the ability to control their experience.
The third prong is giving people the tools to resolve disputes among themselves and to speak up productively and positively against speech that they find offensive.
The first prong is our community standards. You can see a screenshot here; if you go to our site, the link is at the bottom, and you can see the standards in more detail. These lay out the basic areas or issues that our standards govern.
You’ll note that we have a section on bullying and harassment, which we do not welcome on our platform. Some of these are what we would call no-brainers: nobody wants child exploitation imagery on the site, and I would think we all agree that we do not want that.
There are other areas where people are unsure whether particular content is acceptable or not. That is where a global policy comes in.
Here is what we think about it. Facebook’s mission is to help people connect and share, and they are only going to do that if they feel that the platform is a safe place to be. So our number one priority is making sure that people are safe on the site.
At the same time, we want to make sure that people have the freedom to engage in debate and discourse, to share and connect in real, meaningful ways and to raise the awareness of society about issues that are important to them.