Freedom of expression is one of the most important rights when it comes to societal equality for marginalized groups and social media movements such as #metoo and #blacklivesmatter. Yet in recent months, social networks’ algorithms – “designed to maintain a safe place for inspiration and expression” – have often harmed and censored their most vulnerable users. Beyond harassment, violence and hate speech, it is depictions of women’s bodies, nudity, sex and sexuality that suffer most from social media’s algorithmic censorship, replicating the male gaze online.
(Platforms censored 3 out of 12 images in the male category, but 13 out of 18 images in the female category.) This power imbalance within social media has become even more apparent since the passage of FOSTA/SESTA, U.S. legislation that claims to target sex traffickers while in practice having devastating impacts on sex workers.
We at CHEEX are trying to build a community for anyone who is curious about sexual stimulation and education, striving to depict sexuality in a way that is diverse, fun and consensual. Instagram is one of our most important channels for building that community and offering our featured artists an appropriate platform that reaches ‘the masses’.
Unfortunately, we keep encountering limitations: images that only vaguely show sexual interactions, female nipples, and nudity shown for artistic or educational purposes are regularly removed or restricted –
the act of blocking a user’s content on social media sites without them noticing, until they see its impact on their account metrics. Unfortunately, shadowbanning mostly targets pages run by sex workers, queer people, and anyone whose content is deemed “unacceptable”, i.e. deviating from the norm. If you are shadowbanned, your posts won’t show up under hashtags or in your followers’ feeds.
The only way for people to see your posts once you are shadowbanned is to visit your actual profile. At its best, shadowbanning could theoretically weed out bot accounts or users who violate the terms of service, improving the quality of the platform’s communities. At its worst, it is a nearly invisible form of censorship that (accidentally) silences certain viewpoints, or a way for companies to insert more sponsored posts in place of content from real people. Facebook (Instagram) blames any injustices on a glitch in its hashtag system. Some affected artists have described this phenomenon as a “secret denouncing” by Instagram.
Images in breach of Instagram’s community guidelines are flagged through a mix of manual reporting and AI technology. Over 15,000 employees around the world review posts and look for banned material. As The Guardian put it:
After an artistic topless photo by plus-size model Nyome Nicholas-Williams was taken down by Instagram in June this year, her 60,000 followers rallied behind her. The photo did not, in any way, violate Instagram’s community guidelines. As a Black, plus-size woman, Nyome Nicholas-Williams is one of many people who feel Instagram’s restrictions far more acutely than skinny, white women, let alone (white) men. The incident resulted in a change.org petition aiming to “showcase all people of all sizes and ethnicities”, which collected more than 20,000 signatures.
In a recent interview, Nicholas-Williams stated:
Although the CEO of Instagram publicly acknowledged that Instagram needed to review its policies, not much changed. Since then, plus-size influencers from around the world – of diverse racial backgrounds and with millions of followers collectively – have posted an image to their grids containing a search bar with the words:
In mid-October this year, the issue drew even more attention when Instagram took down a photo by Celeste Barber, an Australian comedian, mimicking a post by Candice Swanepoel. Celeste Barber immediately spoke out against the deletion, while Swanepoel’s original photo stayed on the platform. Both Nyome Nicholas-Williams and Celeste Barber are now working with the platform to help update its guidelines for the future.
Adult performers are equally (if not more) affected by Instagram’s censorship. Even without posting images that violate community guidelines, their accounts are targeted more often than those of other celebrities or artists. Most adult performers use Instagram to showcase themselves and promote their personal brands. When a performer’s account is deleted, they lose access to the fans and business connections they have built up – with a potentially significant impact on their income and livelihood.
We simply demand a more nuanced approach to moderation, so that content that contributes to body positivity, educates about sexual health, shows the many different facets of the female body, or simply offers an aesthetic perspective on sensuality can be published freely.
To give an example from Instagram’s community guidelines: while nudity is explicitly not allowed (“with the exception of women actively breastfeeding, post-mastectomy scarring, as well as nudity in photos of paintings and sculptures”), men’s nipples are implicitly allowed on the platform, yet women’s nipples are banned.
While Instagram’s intention of combating hate speech, violence, sex trafficking and abuse is valuable, its community guidelines leave a lot to the imagination and end up harming society’s already vulnerable groups. The existing guidelines are opaque and often discriminate against certain groups of people, and their vague, arbitrary enforcement pushes much of this content into spaces that are anything but safe. That’s why we would like to invite our community to share cases of biased and unfair censorship on social media.
By participating in this campaign, you become part of a movement towards a destigmatized way of dealing with nudity & sexuality.
Use #uncensoredme to participate or to learn more about the campaign.