Now You Done Pissed Me Off: Why Facebook Needs To Change
There’s this thing called Facebook that more people on this planet use than have clean drinking water. That’s clearly speculation on my part, but I doubt my statistical comparison is far off. Everyone is on Facebook, and by “everyone” I mean “even my grandma.” Not every last person on earth is an active participant on the website, but I’d venture to say that even if every person doesn’t have an account, there is a photograph of them posted on someone’s account. Even my dad. Even my grandpa. They, too, are on Facebook.
Let’s get past the whole if-you-don’t-like-it-don’t-use-it non sequitur, because Facebook has made itself a part of our culture and of both our personal and professional lives. It’s easy to use, accessible to everyone (see above), and generally helpful in providing a virtual community center. I like Facebook; I use it daily. It provides a platform for me to promote my business and myself. Though the multi-billion-dollar company doesn’t charge for its services, Facebook is not shy about asking for money to promote business-related postings and sidebar advertisements.
Would we pay for a service like Facebook? Yes, but not in its current state, in which it habitually and blindly restricts the accounts of users who have been inaccurately targeted by other users with a vendetta.
It’s well known that Facebook is vague in its Terms of Service and privacy statements. They aren’t the only ones, but they’re the most widely known. It’s done purposely as CYA (Cover Your Ass) legalese, and when you’re the most-used social media website in the world, you get to do things like that. But it’s those vague Terms of Service that can both help and hurt its users.
My Short Story
I run the public Facebook Page, Smut Book Club, as well as a Secret Facebook Group, Smut Muffins. The Secret Group is set up so that no one outside of the group is even aware of its existence. Unless a Facebook friend directly invites you to the Group and then you are approved by a Group admin, or you provide your email address to a Group admin, you cannot see any of the activity of the group, access its page, or see who is in the Group.
The women (and only women) in the group are there to talk freely about themselves. There are currently over 1,800 members, and I make the group’s intentions and guidelines very clear to everyone entering. This is part of what every member has access to as a “Pinned Post” within the group:
What kind of sexy pictures can I post? Any picture that doesn’t show straight-on peen, straight-on lady bits, and full-on lady nips. If you post something I or the other admins feel is questionable, we’ll remove it. You’re not in trouble and won’t get kicked out unless it becomes habitual. When in doubt, don’t post.
Even with those guidelines and strict monitoring by admins, pictures that would qualify as R-rated are being reported for pornography and violating Facebook’s Terms of Service.
The problem? The images being reported are not pornography. They do not violate the site’s Terms of Service. Yet, either someone within the group has infiltrated its membership and is targeting it for their own reasons, or Facebook’s algorithm is doing the reporting.
I’ve tried contacting Facebook to find out why members are being reported for non-violating images, but because there is no direct way to contact anyone, my hopeful call for help, thrown at a general email box, will probably go unnoticed and unanswered.
What Others Are Doing and Their Stories
I’ve collected articles written by others—even the ACLU—who have had these same issues, and while they are in no way the whole explanation of what is happening and why it’s important to change, they provide some insight into the problem. It also helps to know that We Are Not Alone.
Naked Statue Reveals One Thing: Facebook Censorship Needs Better Appeals Process, Lee Rowland, ACLU
When the ACLU published an article on its Facebook page about a censorship debate in Kansas where a bronze statue was displayed with bare breasts, the image and the post were removed without access to appeals.
(L)ike all censors, its decisions can seem arbitrary, and it also just makes mistakes. If Facebook is going to play censor, it’s absolutely vital that the company figure out a way to provide a transparent mechanism for handling appeals. That’s particularly true when censorship occurs, as it so frequently does, in response to objections submitted by a third-party. A complaint-driven review procedure creates a very real risk that perfectly acceptable content (like…you know, images of public art) will be triggered for removal based on the vocal objections of a disgruntled minority. A meaningful appeals process is, therefore, beyond due.
Facebook’s initial response to content should always err on the side of leaving it up, even when it might offend. After all, one person’s offensive bronze breast is also one of Kansas’ biggest current media stories.
An anonymous Facebook community moderator explains how frivolous reporting of images and content on Facebook is hurting people who could actually be helped by responsible reporting.
Try to understand that for the most part, your bannings are carried out by an automatic algorithm and only a very small percentage of them handled by an actual person. Why is this? While Facebook can boast that they have nearly 1.5 billion users this year, the impressive feat is that they have 1.2 billion active users who log in nearly every day. Each one of those people have a different level of what offends them.
Considering all of these things, I’d like to tell you what offends me. It offends me that nearly 80% of the manual reports I have to read are from people who find themselves offended by something. A recent list of offenses include: (Translated from the poor excuses for grammar and spelling most of them arrive in)
- “I don’t believe in this coin. It goes against what I believe in.”
- “This fish doesn’t look like a fish, it looks like a man’s private parts, and I have small children around me during the day.”
- “I saw this on my feed and I do not approve please take it off.”
- “This page shared my picture without credit or permission.”
- “Can you remove this picture I don’t like it.”
- “This is not true, my God would never let this happen.”
Now, forgive me or not, but personally I can assure you of this. In the face of the actual reportable issues that come in at the rate of 250k per hour, I really don’t give a shit if coins offend you, or if you can’t keep your ass off Facebook while you babysit the neighborhood children. What I care about, what really, truly fucking offends me is the fact that after 5 hours of reading through your butthurt sob stories, I find the 7 year old girl who, too scared to tell an authority has figured out how to report a picture her uncle posted of her that was blatantly, obviously sexual, and begged for help. “He says he’s coming over today at 2. Can you please help me, I don’t know what to do.”
“Why did Facebook never respond to my report or my ban issue?”
I’ll be blunt here. Overtaxed system, automatic algo ignored you. OR, not enough fucks given about your issue if manually modded.
“Does Facebook even see appeals or emails sent about my page or personal issue?”
Again, bluntly, no. Facebook keeps and sometimes rotates out, one-single appeal email address that lands on someone’s list of things-to-do. The rest of them just reply with automated messages about how they might help, they might not. They won’t.
The Facebook page created by the anonymous Facebook community moderator mentioned above.
This page is for those who have issues with Facebook’s reporting system. Admins of pages not constantly breaking the TOS that are harangued ritually on Facebook should private message us to request support.
Michael Stokes Photography, Michael Stokes
The famous photographer of nude and semi-nude athletes has been repeatedly banned for posting his art, which, I must note, does not show genitalia.
Facebook Wages War on the Nipple, Lina Esco
Lina Esco chronicles her recent restriction from Facebook for showing the female nipple in the promotion of her documentary on female nudity and its role in society.
One fact that’s absolutely certain, it’s not the boob, but rather the nipple that really freaks out Facebook, Instagram and the other platforms that banned our film. There’s actually a FB page called “Free The Boobs” which is 100% acceptable for their “community standards” as long as not one millimeter of areola is visible. This policy seems odd, as images on “Free The Boobs” are hyper-sexualized, and the topless images on our banned profile “Free The Nipple” were of protest — baring no sexual context.
After examining Facebook and Instagram’s “community standards” to make some sense of my cyber exile, I discovered you can post videos of people being tortured and killed, but a woman’s nipple is too obscene for their standards. You can buy guns on Instagram and show a mutilated body, but a female areola in a non-sexual context is a “violation of their terms” (FYI Facebook bought Instagram for a billion in stock and cash back in 2012).
We Must (Not) Surrender, Jesse Jackman
After a photograph of Jesse Jackman kissing his husband was reported for violating Facebook’s “Community Standards,” Jackman received death threats from Facebook users and was subsequently restricted from the site.
On the evening of Oct. 7 I posted a photo on Facebook of me kissing Dirk at the gym. The caption was optimistic and upbeat: “I remember a time when I was so closeted that I was terrified of expressing my sexuality in any public place anywhere, let alone a bastion of masculinity like this place. I’ve come a long way I guess.” The photo quickly amassed nearly 3,000 likes and 150 shares — as well as one particularly threatening comment: “All you faggots must die.” I reported the comment to Facebook — all death threats should be taken very seriously — and posted a second photo of me kissing my husband. The caption on the second photo read, “To [the person who threatened me], who apparently wishes death upon me and Dirk and all the other ‘faggots’ out there, I give you this: another pic of me kissing my amazing husband. You can’t win, dude. Love conquers all.” The photo quickly garnered over 6,000 likes and 250 shares — along with several more threats like “You should be shot,” “Go kill yourself,” and “ill rip yr fffing head off.” But before I could report these latest menacing comments, my second photo was unaccountably removed by Facebook for “violating community standards,” and I was banned from my account for 12 hours.
Facebook’s Reporting Algorithm Abused by Antivaccinationists to Silence Pro-Science Advocates, David Gorski
David Gorski explains how Facebook’s reporting algorithms are being used by groups to harass advocates of opposing opinions.
I was reminded of [Facebook’s policies] about a week ago when Dorit Reiss (who has of late been the new favored target of the antivaccine movement, likely because she is a lawyer and has been very effective thus far in her young online career opposing the antivaccine movement) published a post entitled Abusing the Algorithm: Using Facebook Reporting to Censor Debate. Because I also pay attention to some Facebook groups designed to counter the antivaccine movement I had already heard a little bit about the problem, but Reiss laid it out in stark detail. Basically, the merry band of antivaccinationists at the Australian Vaccination Network (soon to be renamed because its name is so obviously deceptive, given that it is the most prominent antivaccine group in Australia, that the NSW Department of Fair Trading ordered the anti-vaccine group to change its misleading name) has discovered a quirk in the algorithm Facebook uses to process harassment complaints against users and abused that quirk relentlessly to silence its opponents on Facebook.
So what can we do? First of all, this needs to be publicized. Whether it’s the AVN doing this (as seems very likely based on strong circumstantial evidence) or other groups of antivaccine warriors, this needs to be publicized. I don’t know whether complaining to Facebook will do any good or not, but it nonetheless has to be done at the Feedback link. It might also help to report this as a bug. The key is that we have to get a lot of reports; otherwise it’s unlikely that Facebook will notice. In the meantime, the only other suggestion I can think of is never to use the name of a known antivaccine activist, in particular the names above that were used to get people temporarily banned. It’s a matter of self-preservation.
Social media sites like Facebook are a very useful tool for community building and disseminating information. However, they can be abused, and that is what appears to be occurring here. There might come a time when Facebook ceases to be useful because its reporting algorithm is too easily abused. Here’s hoping that the management of Facebook can be made aware of that.
Marilyn McKenna has been documenting her 120-pound weight loss online all along, so she was shocked when Facebook made her take a photo down.
What DO we do? I wish I knew.
Spread the word? Don’t report innocuous things that could be settled by simply clicking elsewhere? Hide things we don’t want to see?
For now, we keep on keepin’ on, and with attention to this issue, we wait for things to change.