
Phase 2, Final Project



Video Link: https://youtu.be/Bng6MXLAek4


Annotated Bibliography:


- Jacobs, Julia. “Will Instagram Ever 'Free the Nipple'?” The New York Times, 22 Nov. 2019, www.nytimes.com/2019/11/22/arts/design/instagram-free-the-nipple.html.


In this New York Times article, Julia Jacobs discusses the ways that photographers and artists have evaded Instagram's censorship of the female body, such as coloring over nipples and buttocks or obscuring the view through photographic and digital manipulation. Jacobs also addresses the bias the algorithm shows when comparing photographs of male and female bodies, noting that unlike the male chest, "female chests were considered to be an 'erogenous zone.'" Instagram and Facebook, which owns Instagram, have defended their practice of censoring such images by arguing that the platform is trying to respect the diverse cultures and beliefs of people around the world, much to many artists' frustration.


The article introduces the disparity in judgment that Instagram's censorship algorithm shows between images of men and women, echoing a frustration that guest speaker Max Evans expressed during lecture regarding this difference in treatment. It also outlines the difficulties this censorship creates for artists and photographers, as it restricts their freedom in what they can produce. These censors hamper artists' creative abilities, which affects both the kind of work they show people and the kind of message they want to send out.


- Faust, Gretchen. “Hair, Blood and the Nipple.” De Gruyter, Transcript, 31 Dec. 2017, www.degruyter.com/view/book/9783839434970/10.14361/9783839434970-012.xml.


This report by Gretchen Faust dives further into the issues outlined in the New York Times article above, exploring more examples of the inherent bias that Instagram's censorship algorithm shows toward women: it revisits examples such as the censorship of nipples but also introduces new ones such as unshaven pubic hair and menstrual blood. Faust also goes on to question why images of the female body are flagged by Instagram's algorithm as inappropriate while images of guns and other violent acts are not, highlighting the troubling fact that Instagram's algorithms, and society, tolerate depictions of violence but for some reason find images of the female body unacceptable.


This further reinforces the point that women are consistently on the receiving end of algorithmic bias, with Instagram being one of many platforms where such bias appears. It shows that the bias is not restricted to nudity but extends to other details such as unshaven pubic hair and menstrual blood. While Instagram might claim that its actions only serve to respect different cultures and beliefs, the fact that violence is allowed but self-expression is not seems to contradict the platform's stated mission.


- Pater, Jessica A., et al. “‘Hunger Hurts but Starving Works’: Characterizing the Presentation of Eating Disorders Online.” Proceedings of the 19th ACM Conference on Computer-Supported Cooperative Work & Social Computing, 1 Feb. 2016, dl.acm.org/doi/abs/10.1145/2818048.2820030.


In this report written by professors across several universities, the authors examine the presence of eating disorders such as anorexia and bulimia on social media platforms and how censorship algorithms react to those posts. One of the interesting parts of the report is a list of keywords that many social media platforms use to detect signs of these behaviors; when several of these keywords appear close together in a post, the platform removes it from a person's wall.


Eating disorders and suicidal thoughts are prevalent issues that can affect many people, especially the younger audience and community on Instagram. The article raises this tension when discussing the censorship of such posts. While censorship may help stop the spread of these thoughts among teens, it also prevents the people posting them from getting the attention and help that they need. For many individuals, it is much easier to communicate their thoughts to a group of accepting peers, so removing this avenue makes it harder for them to find someone to talk to. By pushing such posts out of view, we prevent these individuals and their issues from coming to light in the community, which can put them in further danger, and this is an issue that censorship algorithms need to address.


- Noble, Safiya Umoja. Algorithms of Oppression: How Search Engines Reinforce Racism. New York University Press, 2018.


In this well-known book, Safiya Umoja Noble examines the inherent bias behind search engines, mostly Google in this case, as well as its effects in reinforcing racism. Noble presents various examples of Google displaying racist autocomplete results as well as images misrepresenting black women. She uses these results to connect search technology to the rise of racism and sexism, arguing, with evidence, that the results provided by search engines like Google were in fact racially biased and propagated racism to the people who saw them.


Although the book mainly uses search engines as its prime example, the same can be said about censorship on social media. In today's world, a person's opinions about the world are heavily influenced by what their peers post on social media. As a result, what we censor has a great effect on what users think. If we maintain the current method of censoring posts on Instagram, we may reduce the number of people imitating attempts to lose weight by starving, or people expressing suicidal thoughts. However, we also make the public more ignorant of these pressing issues, and the people who need help won't be able to get it as easily. The parallel between social media and search engines shows that when implementing these kinds of systems, we need to be wary of the social engineering that might result and how it will impact people's lives and perceptions of matters such as eating disorders, suicide, racism, and much more.

