ix Dr Monika Bickert is Facebook’s head of policy management. Her global team writes and interprets policies governing what content people can share on Facebook and how advertisers and developers can interact with the site. Dr Bickert joined Facebook in 2012 as lead security counsel, advising the company on matters including child safety and data security. She previously served as assistant U.S. attorney for the Department of Justice for eleven years, prosecuting federal crimes ranging from public corruption to gang-related violence. Dr Bickert also spent several years serving as resident legal advisor at the U.S. embassy in Bangkok, Thailand.
You can see the pillars on the right and the left. We have safety and free expression. In the middle, we have this area where we want people to engage productively. We want them to be civil and respectful. It is not always going to happen on the site. That is not always the way that people talk to one another, and it is not always the way that awareness is raised or issues are discussed, but there are ways that we can foster civility.
We have found that when people are required to put their real name next to their speech, there is a feeling of accountability and they are more likely to engage in a civil manner.
You can see again the three prongs here. When we craft the standards, we communicate them through our public-facing site and our community standards, and each piece of content on Facebook usually has an option in the upper right-hand corner that lets you report it to our teams. Then they will look and see if the piece of content violates our policies.
If it does, it is removed. If it doesn’t, it stays on the site.
It is important to know that when we are crafting those community standards, we are not just engaging with the teams on Facebook. We do this any time we are thinking about refining policy. The policies are an evolving landscape, just as Facebook is an evolving product and the way that people use the internet is evolving as well.
When we refine these policies, we discuss it internally with a number of teams. We want to make sure we understand what the Australian team is telling us, what the situation is in Australia. We also talk to NGOs, advocacy groups and others who have experienced these issues.
We communicate our standards and what our policies are, and make them easy to report. That is prong number one.
Number two is giving people tools to control their experience on Facebook. I don’t know how many people here in the room are on Facebook, but you may have had the experience of blocking somebody, unfriending somebody or hiding specific content from your timeline. You don’t want to report it; you just don’t want to see it. That’s fine.
The other thing we want you to be able to do is control the audience with whom you are sharing. So you can go into your privacy settings and set your general privacy settings, and then you can also go to each piece of content you are sharing, including each photo in each album, and specifically adjust the audience, so you are controlling exactly with whom you are sharing.
And then finally, the third prong is giving people the tools to speak up to resolve their own disputes. If people report something to Facebook, it is routed to a team with special knowledge and training to deal with that particular policy.
All of our people who apply the policies when they review the content are trained in all of the policies, and that is not just a one-time training. That is something they go through again and again.
But we also have teams such as our safety team, so they understand for instance that bullying is not just an online phenomenon. In fact, often the context is offline. Things are happening in school, in the community. And with something that’s reported to us as bullying, our reviewers have to understand that they might not have the entire context. They have to make the best decision they can with the context in front of them, but learning about how these behaviours take place and affect people is very important for our reviewers.
And then finally, I thought I would share with you a bit about what we are doing to help people resolve disputes.
We’ve now had this social resolution tool on the site for a few years. To be clear, this is a tool we are continuing to build, improve and roll out, so I will present to you a snapshot of how it is right now.
The basic idea is, when we get a report like this – here is a photo, it looks like a nice photo – somebody reports it as bullying and we don’t have the context to understand why that person feels bullied. We can try to make a guess, but the better option is if we can empower people to actually resolve this themselves.
So we started looking at our reporting flow and when I say ‘we’, I mean the company. I wish I could claim the credit, but I wasn’t directly involved. We started looking at the language used by people when they report something.
And we realised language really matters. If people were saying they didn’t want to see something on Facebook, often it was because they didn’t like the photo of themselves. But when we asked them, ‘why don’t you like this post?’, we realised that the language, even just changing one word, could make a huge difference as to whether they felt empowered to do something.
Free Speech 2014 • Symposium papers • 3 Free speech in the digital age
So in the first version of this social resolution tool, we asked people, ‘why don’t you like this photo?’ And it had a drop-down menu with a list of characteristics and one of them was ‘embarrassed’. And about 50% of the people would complete that flow.
We started working with some researchers at Yale and Berkeley who specialise in emotionally rich language, and we found that if we made it more conversational and said, ‘I don’t like this photo because it is embarrassing’, that one little tweak led to a much larger percentage of people completing the flow and sending something in.
The other thing we noticed is that we could give people a dialogue box that suggested language they could use to reach out to the person that posted the photo and ask her to remove it.
So when we started this social resolution tool, we gave people an empty box. Basically, the way it would work would be: I post a photo, you don’t like it, you are presented with an option to ask me to remove the photo and we give you a blank box that says, ‘type a message to Monika to say that you don’t like the photo and ask her to remove it’. About 20% of people were doing this.
After working with the researchers at Yale and Berkeley and also testing some different flows, we found that if we provided text, suggested text – and they could change or edit that if they wanted to – with a message like, ‘Monika, this photo makes me uncomfortable. I really don’t like it. Would you mind taking it down?’, then people were more likely to complete the flow.
We were finally starting to understand that there is value in understanding the different ways that people talk in different countries. We are exploring this now, but we are realising that the way people think in India is not necessarily the way people think in Australia when they want to approach somebody about removing content. But our data shows these tools are working and in the majority of cases, when a person receives a message asking her to remove a photo, she will engage in a dialogue and in many cases she will remove the photo voluntarily, which results in a better experience for everyone.
And finally, it is very important to us that people understand that Facebook can be a tool for engaging in counter speech. And this is an example I wanted to share with you from the bullying and harassment context.
This was a young high school student in California who was being bullied for a poor soccer performance. In the wake of someone posting mean photos of him missing a goal – he was a soccer goalie – his teammates posted a photo of him with the caption, ‘we are all Daniel Cui’, and it went viral. Suddenly all the students were doing it and other people in the community were doing it.
This is available on our Facebook page and it features Daniel Cui talking about the way this made him feel empowered and feel strong enough to stand up against bullies. So counter speech is very important to us and we have the platform for it.
3.3 Trish Hepworthx
Executive Officer, Australian Digital Alliance
Topic: Reforming intellectual property
That is a clear articulation of the classic liberal approach to free speech. However, if I express it in those words, I risk infringing copyright, because those words were already used by Mr Tim Wilson,79 back in May this year. If I want to explain the concept, I need to use different words, words that Tim Wilson and others haven’t used before me.
And here is the fundamental tension between copyright and free speech: in giving a copyright owner the right to control the use and reproduction of works, copyright constrains the speech of others.
I was invited here on the promise I would be practical, not overly legalistic. And what I hope to do is show some illustrations of the areas of tension in Australia and overseas at the moment, in expression and in enforcement.
But firstly, just to ensure we are on the same page, copyright is a property right over the expression of ideas, not the ideas themselves.
x Trish Hepworth is the Executive Officer for the Australian Digital Alliance (ADA), the peak body representing copyright users and innovators in the digital world.
The ADA specialises in copyright policy on behalf of the education sector, internet industries, cultural institutions, libraries, consumers and organisations assisting the blind and visually impaired. She is also the Copyright Advisor for the Australian Libraries Copyright Committee.
So Millet in 1866 doesn’t own the idea of two people sleeping in a field.
Sargent in 1875 cannot stop others from painting two people in a field.
But this acknowledged masterpiece by Vincent Van Gogh in 1890 is most probably a flagrant breach of copyright, taking not just the central idea of painting people in a field, but lifting the entire design.
And nowadays Sargent might have slapped Van Gogh with a lawsuit, collected damages and demanded he destroy that and all other derivative works.
In the last election, a Labor Party ad that was based on material screened by the Liberal Party was yanked for copyright infringement,80 and recently a Liberal Party ad in Victoria was pulled81 after it used broadcast news footage without permission, while making a point about corruption.
One recent proposal with the potential to allow free speech considerations to be taken into account comes from the Australian Law Reform Commission (ALRC): introducing a flexible ‘fair use’ exception.82 Fair use is a flexible exception that asks a court to judge fairness according to four main factors: the purpose of the use, the type of work used, the amount used and the impact of the use on the rights holder’s legitimate property rights.
In the U.S. fair use is considered, in most judicial opinions at least, as the safeguard that balances copyright and free speech – to the point where many judges have declared they don’t consider there to be a conflict.83 This is questionable, especially in regards to third party enforcement, but it would be fair to say that free speech is more considered and protected under the U.S. fair use system than under our current system.
Australia lacks the constitutional protection for free speech that the U.S. has, and that appears in U.S. copyright cases. However, fair use would give more flexibility to the copyright system to allow uses with important social benefit, such as promoting free speech. It would also assist with many of the current problems for people who cannot access information because it needs to be translated or because permission is withheld.
And it would give some legal support to the way that people are already communicating online.
And we see this in the U.S., where the Digital Millennium Copyright Act (DMCA) provides similar safe harbours. As the number of requests to remove infringing content has increased, the systems have had to become automated to deal with these requests. To give you some idea of the figures, the Chilling Effects project88 keeps records of the rate of take-down requests:

2009      4,275       0      23    33    56     4,387
2010     16,827     307     508    21    65    17,728
2011     67,571   4,138       0    13    76    71,798
2012    441,370   6,646       0     2   120   448,138

So far in 2014, Google has been asked to remove 26,895,765 URLs and Twitter 9,199; Twitter reports 76% compliance, suggesting a high number of inaccurate requests.
Some of the recent wrongful takedowns they note include:
Marilyn Randall and Drew Hansen are amongst the scholars who have noted that Martin Luther King’s famous ‘I have a dream’ speech liberally plagiarised and sampled earlier works.
And the people commenting on the work, in the internet’s recognised language of kittens, face the probability of their work being censored.
We need copyright. In providing the economic incentive to create and distribute many categories of work, it arguably promotes free speech and it adds to our cultural and economic wellbeing. But it needs to rest in a system that also protects free speech. Australia has a chance, with the reviews of copyright exceptions and enforcement, to ensure that we have adequate safeguards to protect free speech and freedom of expression. And to ensure going forward we have the right policy framework to support both people’s property rights and people’s right to free speech.
The next session will focus on the limits of free speech and how it should be protected.