Social Media: YouTube, Facebook, Twitter, Instagram. Innocuous as they seem, their policies on what counts as free speech and what counts as banned content determine what people can see, depending on who and where they are.
An uploaded video of a democratic protest in Tibet might be viewed on any computer or phone, except the ones in Tibet or China. There are a lot of articles about Facebook's and Google's toleration of censorship and tyranny, but "The Secret Rules of The Internet" is about you and me when it comes to free speech online. It's about the thousands of people who watch uploaded videos of child abuse so that we don't have to. It's also about the controversy over corporations that officially stamp user content as offensive, without input from the public, the police, or elected officials.
A watershed moment came in the case of a YouTube video of a pro-democracy protester shot and killed by government forces in Iran in 2009:
“Mora-Blanco and her colleagues [at YouTube] ultimately agreed to keep the video up. It was fueling important conversations about free speech and human rights on a global scale and was quickly turning into a viral symbol of the movement. It had tremendous political power. They had tremendous political power.”
But that was Iran, a country with very different attitudes toward sex, religion, pornography, and violence than America. Censorship, or as the article calls it, internet "moderation," all started with a few dozen employees at YouTube. They were basically getting paid to watch porn. I mean, the videos that people flagged as porn. Sometimes the vids were, and sometimes they weren't. It depends how we define pornography, something Mora-Blanco and YouTube apparently hashed out by themselves. The big problem, though, is that one person's definition is not another's. It can depend on your cultural background and your opinion on social tolerance.
“No booklet could ever be complete, no policy definitive. This small team of improvisers had yet to grasp that they were helping to develop new global standards for free speech.” It’s quite a pickle, and the problem grows by about 400 hours of uploaded video per minute on YouTube alone.
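To put that growth rate in perspective, here's a quick back-of-the-envelope calculation (assuming the 400-hours-per-minute figure holds steady; the staffing numbers are illustrative, not from the article):

```python
# Rough scale of the moderation problem, given the cited upload rate.
upload_rate = 400  # hours of video uploaded to YouTube per real-time minute

hours_per_day = upload_rate * 60 * 24   # hours of new video every single day
viewers_nonstop = upload_rate * 60      # people watching 24/7 just to keep pace
workers_8h_shift = hours_per_day // 8   # moderators needed on 8-hour shifts

print(hours_per_day)     # 576000
print(viewers_nonstop)   # 24000
print(workers_8h_shift)  # 72000
```

Even with 72,000 full-time reviewers, every uploaded minute would get watched exactly once. No second opinions, no appeals.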
“In 2007…they barred depictions of pornography, criminal acts, gratuitous violence, threats, spam, and hate speech. But significant gaps in the guidelines remained — gaps that would challenge users as well as the moderators. The Google press office, which now handles YouTube communications, did not agree to an interview after multiple requests.”
Silence says a lot, Google; silence that cries out "Yeah, um, about that." Or as the article puts it: "Very little is known about how platforms set their policies — current and former employees like Mora-Blanco and others we spoke to are constrained by nondisclosure agreements." Good old non-discs. Like Ultimate frisbee in college, they're how you get away with murder. Were frisbee murders not a thing at your college? Weird. I should upload a video of that — oh wait.
When things got too big for a small American team to handle, Google did what every growing corporation does: outsource!
In an October 2014 Wired story, Adrian Chen documented the work of front line moderators operating in modern-day sweatshops. In Manila, Chen witnessed a secret “army of workers employed to soak up the worst of humanity in order to protect the rest of us.”
You see, innocent people much like myself an hour ago, social media is not like a hamburger that we pay for, thereby generating a contract with the restaurant for return patronage in exchange for continued cholesterol victuals. In fact, "users are not so much customers as uncompensated digital laborers who play dynamic and indispensable functions (despite being largely uninformed about the ways in which their labor is being used and capitalized)." It's something your tech-savvy friend has probably been telling you at every dinner party that ever went on for more than two drinks.
Your "likes," retweets, photos, links — all of it amounts to freely given personal preferences. Companies, like say Hamburger Time Inc. Transnational, will pay Facebook a lot of money to put their hamburger ad right on your timeline. You snapped a pic of a hamburger after all, captioning it "AMERICA HELL YEAH. NO VEGANS IN THIS HOUSE!" So Facebook is pretty damn sure it can hook up Hamburger TIT with a likely consumer base like you.
Enter the legal jargon that justifies it all: Section 230(c) of the Communications Decency Act:
“No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” These 26 words put free speech decisions into private hands, effectively immunizing platforms from legal liability for all content that does not violate federal law, such as child pornography. All the checks and balances that govern traditional media would not apply; with no libel risk there were, in effect, no rules.
Returning to outsourcing: you know what makes a much better worker? Yes, technically golems summoned by arcane magics, but besides golems. That's right, a robot.
Whenever an image is uploaded, whether to Facebook or Tumblr or Twitter, and so on, he [Hany Farid] says, “its photoDNA is extracted and compared to our known CE images. Matches are automatically detected by a computer and reported to NCMEC for a follow-up investigation.”
PhotoDNA helps to curtail uploads of child abuse, but it doesn't stop anything else (yet), nor does it differentiate between actual child abuse and, say, an anti-abuse advertisement, or a satirical comedy sketch about child abuse (not that that would be a great idea to pursue). In fact, thousands of YouTubers have complained about getting "strikes" on their accounts after being automatically flagged by a machine. These people's source of income is uploading videos to YouTube, but with automatic strikes on their account, they have to go through the lengthy process of getting their videos unstricken, or worse, unbanned. Either way, the process sounds about as appealing as slowly removing duct tape from every hairy surface of your body.
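The extract-and-compare idea Farid describes can be sketched in a few lines. To be clear, this is a toy stand-in, not the real thing: actual PhotoDNA is a proprietary perceptual hash that survives resizing and re-encoding, while the ordinary SHA-256 hash below only catches byte-identical copies. The function names and sample database are hypothetical.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # Stand-in for PhotoDNA's signature extraction: a plain cryptographic
    # hash of the raw bytes. (Real perceptual hashes tolerate edits.)
    return hashlib.sha256(image_bytes).hexdigest()

def screen_upload(image_bytes: bytes, known_bad_hashes: set) -> str:
    # Compare the upload's fingerprint against the known-image database;
    # a match would trigger an automatic report for follow-up.
    return "report" if fingerprint(image_bytes) in known_bad_hashes else "allow"

# Hypothetical database of previously identified images.
known_bad = {fingerprint(b"previously-identified-image")}

print(screen_upload(b"previously-identified-image", known_bad))  # report
print(screen_upload(b"harmless-cat-video-frame", known_bad))     # allow
```

Notice what the sketch can't do: it has no idea *why* an image is in the database, which is exactly why a machine can't tell an abuse video from an anti-abuse PSA.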
So next time you google "Syrian civil war," or more likely "Funny kitties," consider all the things Google is not showing you. Maybe for your own good, or maybe because they don't want to be liable for anything remotely controversial. Not that you'll ever know, because Google isn't going to explain its policies on a case-by-case basis. Even in the case of the killed Iranian protester, YouTube's excuse for keeping the video up was that its competitors would otherwise capture the lost web traffic.
As for Mora-Blanco, she quit her job at YouTube/Google. Now she works as a hair stylist.