The free and open Internet has enabled disparate communities to come together across miles and borders, and empowered marginalized communities to share stories, art, and information with one another and the broader public—but restrictive and often secretive or poorly messaged policies by corporate gatekeepers threaten to change that.

Content policies restricting certain types of expression—such as nudity, sexually explicit content, and pornography—have long been in place on most social networks. But in recent years, a number of companies have changed how those policies are enforced, including demonetizing content or hiding it behind an age-based interstitial; using machine learning to flag content; blocking keywords in search; and disabling thumbnail previews for video content.

While there are some benefits to more subtle enforcement mechanisms—age restrictions, for example, allow content that would otherwise be removed entirely to remain available to some users—they can also be confusing for users. And when applied mistakenly, they are difficult—if not impossible—to appeal.

In particular, policy restrictions on “adult” content have an outsized impact on LGBTQ+ and other marginalized communities. Typically aimed at keeping sites “family friendly,” these policies are often unevenly enforced, classifying LGBTQ+ content as “adult” when similar heterosexual content isn’t. Similarly, as we noted last year, policies are sometimes applied more harshly to women’s content than to similar content by men.

Read the full article:

Blunt Policies and Secretive Enforcement Mechanisms: LGBTQ+ and Sexual Health on the Corporate Web by Jillian C. York

Oct. 24th, 2018