Online Censorship Through Selective Demonetization

By Robert Porter (eQuality Project Research Assistant)

YouTube recently made the decision to clarify its policy concerning the “advertiser friendliness” of content creators’ videos, which has led to an apparent increase in the demonetization of online content. Demonetization is essentially the reduction of content creators’ ability to collect ad-based revenue from their content. While demonetization has been occurring since 2012, it was only recently that YouTube changed how it communicates these decisions to its content creators. Thus, while this policy of demonetizing “non-advertiser friendly” content has been in place for years – seemingly without the knowledge of a large proportion of content creators – this is the first that many have heard about this “new” policy.

So what does “advertiser friendliness” actually mean? According to YouTube’s policy, the content uploader must ensure that the video content, metadata, and thumbnail image contain “little to no inappropriate or mature content.” Furthermore, material that is not considered appropriate for advertising includes “controversial or sensitive subjects and events, including subjects related to war, political conflicts, natural disasters, and tragedies, even if graphic imagery is not shown.”

There are a large number of content creators on YouTube who pick up on current events, news stories, and political developments, all of which can be deemed “controversial or sensitive” depending on one’s perspective. One of the main joys of YouTube is that it originally served as a place where one could upload pretty much anything and share it with the world (provided it violated no laws, such as copyright legislation). With increasing censorship, this freedom is slowly being eroded.

Essentially, YouTube is simply adjusting and enforcing its own terms of service, which it is perfectly entitled to do, and while the recent clarification of its policy has spawned protest hashtags such as #YouTubeisDead, there is little doubt that YouTube will survive this online snafu. The real issue, however, is that the enforcement of this demonetization policy is incredibly uneven. While independent content creators seem to be targeted, large corporations such as CNN do not appear to be held to the same standard with regard to what content is permitted in monetized videos.

From where I sit, however, there remains the glaring (un)intentional sending of a rather pointed message: this looks like blatant censorship of smaller, independent content creators. Demonetizing an individual’s content equates to censorship. Full stop. Furthermore, demonetizing individuals while allowing large corporations to continue unimpeded reeks of the de-democratization of the online sphere. Again, the (un)intentional message is disturbing: “If we don’t like your message, we can shut you down.”

Another major problem is that, whatever algorithm YouTube is using, it seems to be both incredibly broad and oddly specific. As the CBC reported, videos demonstrating educational science experiments such as “Mentos Coke Explosions” have been demonetized under the policy, as have several videos on suicide prevention, because “suicide” is one of YouTube’s “problematic” tags. There is no room for nuance, and, as a result, a significant number of creators who are producing content for educational or beneficial purposes are at risk of being swept under YouTube’s digital rug.
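To illustrate why purely tag-based matching leaves no room for nuance, here is a minimal, entirely hypothetical sketch. YouTube has not published its actual algorithm; the tag list and function name below are invented for illustration only.

```python
# Purely hypothetical sketch of tag-based flagging; YouTube's real
# algorithm is not public, and this blocklist is invented.

PROBLEMATIC_TAGS = {"suicide", "war", "tragedy", "natural disaster"}

def is_advertiser_friendly(video_tags):
    """Return True only if no tag matches the blocklist; context
    (education, prevention, news) is never considered."""
    return PROBLEMATIC_TAGS.isdisjoint(video_tags)

# A suicide-prevention video is flagged exactly like harmful content:
print(is_advertiser_friendly({"suicide", "prevention", "mental health"}))  # False
print(is_advertiser_friendly({"cats", "comedy"}))                          # True
```

Because a filter like this sees only tags, a prevention video and an exploitative one are indistinguishable to it.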

Of course, the number of content creators who rely solely on advertising revenue from their YouTube content is incredibly small. Many users, however, are only able to continue producing content because of the advertising revenue they receive based on their viewers and subscribers. When a company has little or no competition, it is much easier to yield to the demands of other large corporations (and funders) than to stand up for the “little guys” who produce a vast range of its content.

So why does this matter? Why should we be concerned with whether a massive online corporation limits the perspectives of its users through the reduction of advertising revenues of video producers? It matters because this issue touches on the prevailing use of big data and algorithms to monitor and collect information concerning youth and their online lives.

The amount of data that is funnelled through big data algorithms is increasingly worrying, especially since it tends to trap individuals in discriminatory categories. In this instance, the algorithms are contributing towards the censorship of ideas and opinions online, but algorithms like this could very well be used to reduce individuals to data-figures in discriminatory data-driven systems.

Complications such as this are bound to arise when the public sphere, personal communication, and social media are dragged into corporate models of commercialization. The big question that remains is what kinds of responses we will see to an increasingly algorithm-driven world.

Inequality in Gaming

By Trevor Milford (eQuality Project Research Assistant)

On 15 September, professional gamer and game designer Stephanie Harvey came to the University of Ottawa to discuss her experiences in the gaming industry. As a doctoral candidate working on issues involving discrimination in gaming, I was particularly interested to hear Stephanie’s insights on how inequality and virtual harm impacted her livelihood. I’d heard about the event through a promotional article entitled “Ending cyberbullying is everyone’s responsibility” and was familiar with some of Stephanie’s work on spreading awareness about gaming-related gender inequality. Perhaps most notably, she is known for founding MissCliks, an initiative committed to ensuring that “people of all genders can participate in geek and gamer culture without fear of prejudice or mistreatment”.

I was excited to hear Stephanie’s insights, but perhaps even more excited – as I always am – that issues of inequality in gaming were being given a public platform. Many of us recall the relatively recent GamerGate controversy that brought gaming-related discrimination to the forefront of public consciousness. For social scientists like myself, GamerGate can be used as an inroad to bring discussions about inequality in gaming not only to academia, but also to the general public. I was eager for Stephanie to talk about her experiences with misogyny in gaming, both in hopes of applying what she said to my own research on GamerGate and because of the sheer importance of talking about the reality of these problems. Talking about discrimination in the abstract can be a tough sell – it’s an honour when people are willing to talk about their experiences and bring research on discrimination to life, making what we do as academics more “real” and impactful. Having these discussions at postsecondary institutions is particularly timely and important since cultures of misogyny are deeply entrenched on university campuses locally, across Canada, and across broader North America.

Talks like Stephanie’s are a great way to combat discrimination in gaming while illustrating its harms and impacts through a topic that’s engaging for audiences, and for students specifically. While roughly the same percentage of men and women play games, about three quarters of game developers are male. Women in the gaming industry and industry-related media have received targeted death threats and other threats of violence, have had their personal information publicly “doxxed” by online harassers, and have felt compelled to flee their homes because of online aggression. Although I could continue, in the interest of saving space it suffices to say that it’s difficult to ignore the undercurrents of gender inequality and misogyny that run through the gaming industry.

Stephanie shared that misogyny is something she has encountered throughout her career, referring to the gaming industry as a “boys’ club” where women are seen as disruptive to a male-dominated status quo. Women, she offered, are commonly seen by professional gamers and by industry insiders as “creating chaos” by threatening gaming’s patriarchal foundations. She mentioned that she and women she knew had witnessed a range of misogynistic beliefs both from coworkers and from fellow gamers. These include beliefs that women create more job competition, distract men, are less skilled than male players and “bring down” quality of gameplay, and are fundamentally changing gaming itself in a way that is inherently negative – namely to be more inclusive.

Stephanie also linked misogyny in video gaming to the fact that most gaming happens online and that the Internet can enable anonymity (despite some studies suggesting that anonymity actually makes online comments less extreme or contrarian). She described many harassers in games as “keyboard warriors” or “trolls” who wouldn’t behave similarly offline, paralleling harassment in gaming with harassment in online chat rooms, discussion forums or anonymous social networks. The anonymous nature of virtual harassment can make these behaviours seem less real even though they have been shown to have very real impacts for victims. This association means that misogyny both online and in video gaming can come to be seen as unserious, unproblematic, endearing, or funny, ultimately silencing serious discussion about the harms of virtual gender discrimination.

I was anxious to hear more insights and to tease apart a couple of Stephanie’s claims that I found problematic – for instance, that “men are genetically better” at certain types of games, that “men are more social”, and that “in social life men are really good at competition” (is the implication that women are not?). However, about ten minutes after she began to speak about women in gaming, Stephanie’s allotted discussion time reached its end. Since audience questions had been interspersed with Stephanie’s talk, much of it until this point had been directed by queries from the audience. It wasn’t lost on me that, save for the two questions Stephanie had time to answer after starting to speak specifically about women and gaming, all but one audience question had been from male audience members. None involved gender, or even discrimination more generally, except for one question from a male in the front row – “How do you feel about female gamers using their bodies to make a living?” – posed as if this were something problematic and shameful. (To be fair, Stephanie’s answer was “I’m all for it”.) There was no critical discussion about GamerGate, very limited discussion about harassment or bullying, and little discussion about how the inequalities for which Stephanie works to raise awareness could be resisted or mitigated.

I was taken aback that no one really seemed to care about discrimination or inclusivity – not necessarily in a radical feminist “let’s overhaul the game industry!” sort of way, but even just in a way that hinted they were willing to discuss these issues at all. Instead, the audience largely seemed focused on issues such as how to make money off gaming professionally, what made X game such a great game, or how to secure prestigious positions as game developers. I was surprised that there was evidently limited interest in lines of inquiry more critical than the venture-capitalist potential of gaming. Of even more concern, it seemed lost on the audience that they were enacting patriarchy by dominating a discussion that had been advertised (and even reported on afterwards) as being about (gender) discrimination in gaming, using it to solicit get-rich-quick tips. For me, this illustrated a key problem: patriarchy is normalized. And because it’s normalized, we can fail to recognize it, fail to recognize the harms it creates, and fail to recognize when we perpetuate it. This is true whether our terms of reference are the gaming industry, a university campus, or society in a more general sense. So, readers, I implore you: please help to de-normalize it in whatever way you can. Maybe by making a game about it. …I’ve heard you can get rich quick.

 

Professor Ruparelia wins Award for the Scholarship of Teaching and Learning

Professor Rakhi Ruparelia was recently awarded the 2016 CALT Award for the Scholarship of Teaching and Learning for her article “Guilty Displeasures: White Resistance in the Social Justice Classroom.” The paper, which can be read here, focuses on the experience of being a racialized woman teaching white students about Critical Race Theory, and on the complicated emotions experienced by those students.

 

Talking Not Spying

By Jolene Hansell (eQuality Project Research Assistant)[1]

GoGuardian is a program installed on about 3 million school-owned computers. The program can monitor a student’s web browsing and searches even when the student is at home in the evenings or on weekends. It automatically flags certain search terms, including those related to suicide. The idea is that when a student searches about suicide, the computer flags the search for the school’s IT director, who can then call up the student’s browsing history to get a more detailed picture of what the student is going through and get the student assistance if needed.
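To make the mechanism concrete, here is a minimal, entirely hypothetical sketch of how this kind of keyword flagging works in principle. GoGuardian’s actual implementation and term list are not public; every name and term below is invented for illustration.

```python
# Purely hypothetical sketch of keyword-based search monitoring;
# GoGuardian's real implementation and flag list are not public.

FLAGGED_TERMS = {"suicide", "self harm"}  # invented example list

def notify_it_director(student_id, query):
    # Placeholder for a real alert (email, dashboard entry, etc.)
    print(f"ALERT: student {student_id} searched: {query!r}")

def check_search(student_id, query):
    """Flag the search and alert the school's IT director whenever the
    query contains a monitored term, regardless of the student's intent."""
    if any(term in query.lower() for term in FLAGGED_TERMS):
        notify_it_director(student_id, query)

# A student researching prevention resources is flagged all the same:
check_search("s1024", "suicide prevention hotlines")
```

Note that a matcher like this cannot distinguish a student in crisis from one researching prevention resources for a class project; every match triggers the same privacy-invading review.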

Sounds like a great way to prevent suicide, right? But the situation may be more complicated than it first appears.

My main problem with this model is that it perpetuates the stereotype that there is something stigmatizing about mental illness. The student, who may not feel able to talk about their struggles with their mental health with anyone, is using the anonymity of the Internet to get information. When the school invades the student’s privacy by accessing their online browsing history, it perpetuates the societal notion that there is something shameful about the way this student is feeling and about what the student is searching. This vicious circle continues to push mental health issues into the dark corner of things we are not prepared to talk about in our society.

Suicide is the second leading cause of death among youth, and one of the biggest issues facing our world today. I have no doubt that GoGuardian’s intentions are good – trying to reach out and help individuals struggling with their mental health before they become a suicide statistic is a noble objective. But further stigmatizing mental illness makes the problem worse, not better.

The best tool we have in the fight against suicide is conversation. Every day we are bombarded with things we need to do for our physical health—eat right, exercise, get a good night’s sleep—but we are less apt to discuss the things we do for our mental health.

Mental health needs to be part of our daily conversations; it needs to be okay to say, “I’m not okay”. Rather than employing technologies that invade a student’s privacy, schools should be incorporating conversations about mental health into their daily classes. By facilitating this conversation, schools will create an environment where individuals who are struggling with their mental health will feel comfortable to speak up and ask for help, and remove the need for monitoring technology altogether.

[1] Jolene Hansell is Vice President of the Paul Hansell Foundation. The Foundation supports programs aimed at promoting the emotional well-being of youth and works to include the mental health conversation in our daily lives.