  • wz1340

Phase 2, Post #6

In the presentation given today, we had two key speakers. The first was Chancey Fleet, who talked about the use of tactile objects to help people with visual impairments understand diagrams and other materials, as well as her experiences interacting with technology as a blind person. One of the key ideas from her talk was the "dark pattern of accessibility" on social media platforms. She cited Twitter as her primary example, explaining that while Twitter does have a feature that assists people with visual and hearing impairments by providing descriptions of images or of what people wrote, you have to manually opt into it first, since the option is disabled by default. This makes things extremely difficult for someone who is blind: how would they know where to tap to enable that option if they can't see it in the first place? In addition, even after enabling the option, Fleet reported that for her and many others, out of the hundreds of images in their Twitter feeds, only about one in a hundred is actually described. The dark pattern in accessibility that Fleet describes seems to suggest that while you can strive to make your platform more accessible for people with visual and auditory impairments, if you don't make that accessibility readily available to them in a simple and reliable way, then the effort is largely wasted and doesn't help at all.


The second key speaker was Max Evans, a transgender mechanical engineer who discussed the interactions social media has had with the LGBTQ+ community and how companies handle problems when things get out of hand. According to Evans, what most tech companies do, especially with their censoring algorithms, is build large and complicated systems that even they don't have a solid grasp of, and when problems arise on their platforms, such as people's photographs and images being wrongly blocked or censored, they like to "cede their authority to the algorithm." In other words, instead of taking responsibility for what happened because of their algorithm, they tend to shrug it off and say things along the lines of "It's just our algorithm." While it may be true that algorithms are never perfect, Evans argues that this practice of shifting the blame onto the algorithm instead of accepting it themselves is irresponsible, one that is unfortunately prevalent in most tech companies today, and reflects an unhealthy mindset. This kind of negligence toward the algorithms being created gives rise to a form of algorithmic bias that can create problems for the inclusion of minorities.


While technology may be becoming more advanced every year, it is important, as our two guest speakers outlined, to uphold the practice of inclusion and accessibility for everyone, including people with disabilities and the LGBTQ+ community. They are all a part of our lives, and it is important that everyone is treated well in the face of technological advancement and daily use.


