
Phase 1, Midterm Project: Advertising Inequality

Updated: Mar 12, 2020


Whether you're looking through your daily feed or simply scrolling through your favorite group, chances are you've run into advertisements while you're on the Internet. As a major source of revenue for many large companies, advertisements are quite literally all over the Internet. Facebook is just one of many companies that run them, serving millions of ads to its users every minute. However, what one person sees on their page can differ greatly from what another sees. Facebook's algorithms determine which ads are shown to each user based on their demographics, which can create large disparities in the content people see and how they respond to it.


These disparities in what people see on their Facebook feeds every day affect everyone, but the effects are most noticeable when comparing advertisements across genders. According to recent statistics, Facebook's user base is approximately 43% female and 57% male, and 83% of women and 75% of men on the Internet use Facebook (Aslam, 2019). From these statistics, it is clear that regardless of gender, Facebook is a regular part of people's day-to-day lives. However, even with such a diverse user base, users can and will be treated differently by Facebook's algorithms when it comes to the ads they are shown.


While seemingly a small problem at first glance, algorithmic bias raises red flags for many diversity issues, and gender is one of the areas that deserves the closest attention. The statistics may suggest diversity, but the treatment users receive from both the men and the machines leads us to conclude otherwise.


To understand how Facebook engineers inequality with its ad suggestions, it's important to understand how Facebook determines which ads to show its users. According to its official Help Center, Facebook selects ads based on factors such as which posts grab your interest, the personal information you provide on your Facebook profile, your activity on Facebook-owned products like Messenger and Instagram, and data that advertisers already have on you (Facebook, 2020). Advertisers can also specify which groups of people they would like their ads shown to, a business practice commonly known as hypertargeting. Facebook Ads benefit both Facebook and businesses: Facebook earns money for hosting the ads, and businesses can reach their target consumers far more easily, helping them grow. In the third quarter of 2019, Facebook reported that "over seven million active advertisers were using the social networking platform to promote their products and services, up from six million advertisers in the first quarter of the previous year" (Clement, 2020). With the sheer volume of ads being sent out every minute on Facebook, it is nearly impossible to be exposed to all of them. However, even if we can't see ads from all seven million advertisers out there, we shouldn't be continuously shown the same old ads we already see day to day. The ads we see play a large role in shaping what we think and believe, and hypertargeted ads only fuel a perpetual cycle of tunnel vision in our thinking.
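To make the idea of hypertargeting concrete, here is a minimal, purely illustrative Python sketch of how an advertiser-specified audience filter might narrow who is even eligible to see an ad. This is not Facebook's actual system; the field names, criteria, and example users are assumptions made only for illustration.

```python
# Illustrative sketch of hypertargeting: an advertiser-defined audience filter.
# This is NOT Facebook's real code; all fields and criteria are hypothetical.

from dataclasses import dataclass

@dataclass
class User:
    user_id: int
    gender: str        # demographic data inferred or self-reported
    age: int
    interests: set     # topics inferred from on-platform activity

def matches_audience(user: User, targeting: dict) -> bool:
    """Return True if the user falls inside the advertiser's chosen audience."""
    if "genders" in targeting and user.gender not in targeting["genders"]:
        return False
    if "min_age" in targeting and user.age < targeting["min_age"]:
        return False
    if "interests" in targeting and not (user.interests & targeting["interests"]):
        return False
    return True

# Example: a job ad hypertargeted only at men aged 25+ interested in tech.
job_ad_targeting = {"genders": {"male"}, "min_age": 25, "interests": {"technology"}}

users = [
    User(1, "female", 30, {"technology", "education"}),
    User(2, "male", 28, {"technology"}),
]

eligible = [u.user_id for u in users if matches_audience(u, job_ad_targeting)]
print(eligible)  # [2] -- only user 2 can ever see the ad, regardless of skill
```

Even in this toy version, the female user is filtered out before ad delivery ever begins, which is exactly the kind of exclusion the rest of this post is concerned with.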


Gender is just one of many contexts in the debate over algorithmic bias in Facebook's advertising system. Race, age, nationality, and other salient social identities have been raised in the same debate in the past. When new cases of such bias surface, they are best dealt with immediately: letting the problem linger projects the bias onto users, which can set off a cycle in which the bias is reinforced and then reaffirmed by even more ads. This is a serious issue in an age where information and ideas spread across the Internet like wildfire, so when a case of bias does appear, it is strongly condemned by the affected groups.


For example, a study reported by MIT Technology Review examined the differences in job advertisements seen by male and female Facebook users. The researchers found that ads for jobs in nursing, education, and secretarial work were shown more often to female users, while jobs in fields like lumberjacking and artificial intelligence were recommended more often to male users (Hao, 2019).

The results above suggest that, in the eyes of Facebook, men are better suited for fields like lumberjacking and artificial intelligence, while women belong in medicine and education. While this does not necessarily mean that the Facebook algorithm is biased against women, the results do suggest that Facebook's method of delivering ads only worsens existing bias about the jobs women typically undertake. By letting this practice of ad distribution continue, the inequality embedded in the ad delivery algorithm is likely to project itself onto the real world, widening the gender gap, which in turn leads businesses to run more advertisements that perpetuate this bias and inequality in an endless cycle. A similar concern led to action in the United Kingdom, where certain television commercials were banned because they reinforced gender stereotypes. In an interview, the chief executive of the Advertising Standards Authority (ASA) stated, "Our evidence shows how harmful gender stereotypes in ads can contribute to inequality in society, with costs for all of us. Put simply, we found that some portrayals in ads can, over time, play a part in limiting people's potential" (2019). While some may argue that people are being oversensitive, it is far safer to address the problem early, at its root, than to let the bias perpetuate itself and cause irreversible damage.
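The "endless cycle" described above can be sketched with a toy simulation. This is an assumption-laden illustration, not a model of Facebook's real delivery system: the click rates and update rule are invented, and the sketch only shows how an engagement-optimizing delivery rule can amplify a small initial skew in who clicks a job ad.

```python
# Toy simulation of bias amplification in engagement-optimized ad delivery.
# Purely illustrative; the rates and mechanics are invented for this sketch.

import random

random.seed(0)

# Assumed starting click-through rates for a nursing job ad (slight initial skew).
ctr = {"female": 0.055, "male": 0.045}
share_shown_to_women = 0.5  # delivery starts out even

for round_num in range(1, 6):
    clicks = {"female": 0, "male": 0}
    for _ in range(10_000):  # impressions this round
        gender = "female" if random.random() < share_shown_to_women else "male"
        if random.random() < ctr[gender]:
            clicks[gender] += 1
    # Engagement-optimizing rule: next round, allocate impressions in proportion
    # to which group clicked more, amplifying the initial difference.
    total = clicks["female"] + clicks["male"]
    share_shown_to_women = clicks["female"] / total
    print(f"round {round_num}: share of impressions shown to women = "
          f"{share_shown_to_women:.2f}")
```

After a few rounds, nearly all impressions go to women even though the underlying difference in click rates was small; under these invented assumptions, the delivery rule turns a minor behavioral skew into near-total segregation of who sees the job ad.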


Unfortunately, Facebook has made little progress over the last year in addressing the gender bias exhibited by its ad delivery algorithm. A New York Times article examining the allegations reported that Facebook "generally did not take down job ads that exclude a gender" (Scheiber, 2018). Rather than go to any lengths to address this bias, Facebook chose to ignore it entirely.

To address this perpetual algorithmic bias in Facebook Ads, I believe we should first and foremost limit businesses' use of hypertargeting. While businesses are free to decide whom they prefer to hire, it is important that they prioritize hiring people who are skilled, not people of a particular gender. In addition to action from businesses, I believe the general public needs to voice its disapproval of the company's actions more actively and more fiercely. As users of Facebook, we should have control over what we see on our pages, and advertisements are no exception. Bias and inequality arise from negligence by both businesses and their consumers, so rather than ignoring the issue as Facebook has, both sides need to stay wary of biases when they appear and deal with them swiftly.



To further examine the issue of inequality in Facebook Ads, I interviewed two of my friends, one male and one female. I first asked them to open the Facebook app on their phones, and we spent several minutes scrolling through their feeds. We found some similarities, such as advertisements for coding bootcamps and jobs at Microsoft, but there were also differences. For example, my male friend saw suggestions for jobs as a video game tester, while my female friend was shown job opportunities in nursing and teaching. I asked them what they thought of the advertisements Facebook showed them and whether the delivery seemed biased; both responded yes. We also discussed how hard it is to implement an algorithm that is not biased, since people themselves are inherently biased in some way, shape, or form.


Bias may not be something that algorithms necessarily engineer on their own, but they are definitely reinforcing and amplifying it. As the age of digital information continues, it is important that we do not stand idly by when there are cases of bias that can be addressed. Businesses need to be mindful of their hiring criteria, Facebook needs to watch which ads it runs and which users receive them, and we as users need to be more aware and proactive in pointing out cases of inequality. Taken together, these steps would let us combat and prevent inequality more effectively, so that less bias arises from content on the Internet.


Presentation: https://docs.google.com/presentation/d/1ONulHjN9hu9FGfFwGZBcl0nV5F_zqBrYkzNDK-vh1_k/edit?usp=sharing


Sources:

- Aslam, Salman. “Facebook by the Numbers: Stats, Demographics & Fun Facts.” Omnicore, 10 Feb. 2020.

- “How Does Facebook Decide Which Ads to Show Me?: Facebook Help Center.” Facebook, www.facebook.com/help/562973647153813.

- Clement, J. “Facebook Active Advertisers 2019.” Statista, 30 Jan. 2020, www.statista.com/statistics/778191/active-facebook-advertisers/.

- Hao, Karen. “Facebook's Ad-Serving Algorithm Discriminates by Gender and Race.” MIT Technology Review, MIT Technology Review, 8 Apr. 2019, www.technologyreview.com/s/613274/facebook-algorithm-discriminates-ai-bias/.

- “'Harmful' Gender Stereotypes in Adverts Banned.” BBC News, BBC, 14 June 2019, www.bbc.com/news/business-48628678.

- Scheiber, Noam. “Facebook Accused of Allowing Bias Against Women in Job Ads.” The New York Times, The New York Times, 18 Sept. 2018, www.nytimes.com/2018/09/18/business/economy/facebook-job-ads.html.
