
2.18: Computer Science



Computer Science is the study of computation, algorithms, and technologies, as well as their societal impacts and applied uses. In an ever-evolving world, Computer Science is central to the STEM argument that technological innovation sets the basis for the rest of society. Technologies such as facial recognition, artificial intelligence, and automation have the potential to change society profoundly. However, it is not guaranteed that these changes will prove positive for society.

In June 2020, introspection and reflection arose with fervor in the wake of the murder of George Floyd. This held true throughout academia, including the field of computer science. A cohort of Black computer scientists, including professors from the University of Michigan, wrote an open letter calling for an end to racism within the discipline. In it, they discussed the history of inequity and asked for a role in establishing fairness and equal opportunity, as well as equal partnership and representation within the field's leadership (McAlpine, 2020). In a nation that was built on discrimination and white supremacy, no facet of society is spared, and this includes Computer Science and its increasingly advanced technological innovations. Artificial intelligence and Big Data continue the cycle of inequality and specifically target historically marginalized people of color. While technology is praised as a tool of societal advancement, these same technologies marginalize and disrupt communities of color through racial profiling. The letter articulates multiple offenses in Computer Science education. From microaggressions, such as Black students being harassed and followed while accessing research and computer labs, to being told they do not fit industry culture, students face challenges on the basis of their race in addition to all the other everyday challenges of higher education (An Open Letter, 2020).

Innovations such as AI (Artificial Intelligence) and Big Data continue to target historically marginalized communities. Algorithms systematically skew decisions about admissions, housing, hiring, and more. Facial recognition technology and machine learning label Black people as criminals, leading to automated racial profiling. Police and law enforcement have a long history of labeling and stereotyping Black people as criminals and outsiders who do not belong. This has held true over the decades, even at universities. Many colleges have looked to automate their methods for determining who belongs on campus and who does not and may be treated as a criminal.

The University of California, Los Angeles (UCLA), one of the first colleges to utilize facial recognition, decided to use the technology to identify people in view of the cameras dispersed throughout campus. The nonprofit organization Fight for the Future decided to mirror UCLA's approach and found harrowing results. The organization used Amazon's facial recognition software, Rekognition, to compare over 400 members of the campus community, including faculty members and student athletes, to mugshots from a large-scale database. The test returned 58 false positive results, connecting students and faculty with people identified as criminals in the mugshot database. The most common misattributions occurred with people of color, and many times two people with nothing in common beyond race were identified as the same person with “100% certainty” (Jones, 2020).

Furthermore, Daniel Neill and Will Gorr, two researchers at Carnegie Mellon University, developed a software tool named CrimeScan, designed as a crime-predicting aid (Rieland, 2018). The pair asserted that crime can have a “gateway effect”: smaller-scale crimes serve as a predictor of more serious crimes in the future. Programs like these also frequently rely on past police data. By using historical police data that is built on racial biases, they create feedback loops that continue the same cycles of race-based policing. Certain neighborhoods are deemed “good” or “bad,” and arrests and policing guided by artificial intelligence become even more vulnerable to bias, because they are based on police decisions, not actual crimes (Rieland, 2018).
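
    The feedback loop at work here can be illustrated with a toy simulation. The sketch below is a hypothetical illustration, not CrimeScan or any real system, and every number in it is an assumption: two neighborhoods have the same underlying crime rate, but one begins with more recorded arrests because it was policed more heavily in the past. Patrols are then allocated in proportion to recorded arrests, and new arrests are recorded only where patrols are sent, so the historical disparity is reproduced round after round.

    ```python
    # Toy simulation of the feedback loop described above.
    # All numbers are illustrative assumptions, not real data or any vendor's model.
    import random

    random.seed(0)

    TRUE_CRIME_RATE = 0.05        # identical underlying crime rate in both neighborhoods
    TOTAL_PATROLS = 100           # patrols allocated each round
    recorded_arrests = {"A": 60, "B": 40}   # neighborhood A starts with more recorded arrests

    for step in range(10):
        total = sum(recorded_arrests.values())
        for hood, history in list(recorded_arrests.items()):
            # Patrols are allocated in proportion to historical (recorded) arrests.
            patrols = round(TOTAL_PATROLS * history / total)
            # Arrests are only recorded where patrols are sent, even though the
            # true crime rate is identical in both neighborhoods.
            new_arrests = sum(random.random() < TRUE_CRIME_RATE for _ in range(patrols))
            recorded_arrests[hood] += new_arrests

    print(recorded_arrests)
    # The recorded gap between A and B never corrects toward the equal true rates;
    # the neighborhood that starts with more recorded arrests keeps "earning" more.
    ```

    Even in this deliberately simple model, neighborhood A keeps looking “higher crime” purely because it is watched more closely, which is the self-confirming loop that researchers and civil liberties groups warn about.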

Another ethical question arises: if a tool raises the expectation of crime in a neighborhood, will this in turn increase police aggression in that neighborhood? And if so, will this generate more data that confirms the existing bias in the AI? Several organizations, such as the Brennan Center for Justice and the ACLU, have also raised questions about flaws in these AI systems and the biases programmed into them.

Timnit Gebru is a computer scientist who specializes in artificial intelligence and the former co-lead of Google’s ethical AI team, a position she held until she reported being forced out. Gebru coauthored a groundbreaking article showing that facial recognition is significantly less accurate at identifying women and people of color, which can lead to discrimination against them (Hao, 2020). According to records including articles, tweets, and emails, her work was met with disdain from colleagues, specifically Jeff Dean, Google’s head of AI, who said the paper failed to meet the expectations set for publication. At least 1,400 Google staffers and 1,900 others signed letters of protest, and several other prominent AI and ethics figures argued that Google forced her out because of her role in raising questions about the ethical application of artificial intelligence and other forms of advanced technological innovation.

The discriminatory aspects of technology extend beyond artificial intelligence into areas such as machine learning, hiring algorithms, and even healthcare. The organization Black in Computing has listed several actionable items to promote change and equality in the field. These items include, but are not limited to, supporting underrepresented students, improving workplace and academic environments, and changing outdated policies and procedures (Action Item List, n.d.). Technology is ever evolving, yet it must also evolve in an equitable and just direction, or else society risks falling into a dystopian world like those found in science fiction.


This page titled 2.18: Computer Science is shared under a CC BY 4.0 license and was authored, remixed, and/or curated by Susan Rahman, Prateek Sunder, and Dahmitra Jackson (CC ECHO).
