
5.2.3: Conclusion and Recommendations


    Our report is one of the first multi-institutional investigations into college students’ awareness of and concerns about how algorithms shape the news and information they receive from internet giants such as Google, Amazon, YouTube, Instagram, and Facebook. Qualitative data were collected from 16 student focus groups and 37 faculty telephone interviews at eight U.S. colleges and universities. Together, these discussions examined how students were experiencing the dramatic changes brought on by algorithms, while exploring the extent to which personalization, privacy protection, machine learning, and AI have entered the classroom.

    This research was conducted in the wake of coverage of the Cambridge Analytica scandal, one of the most-read news stories of 2018.[1] Powerful algorithms had trawled through 87 million people’s Facebook profiles to sway voters during two hotly contested campaigns: the Brexit referendum in the U.K. and the presidential election of Donald Trump in the U.S. Since then, algorithms have clearly entered the public conversation, and students in our study, like so many around them, were frustrated with the opaque lines of code trying to influence their interactions on popular websites.

    Most students recognized that as information has become ubiquitous, the hidden levers that personalize results and nudge us toward selected information often camouflage complexity behind the appearance of simplicity and efficiency. Moreover, many, though not all, were aware that data-driven algorithms, if unexamined and unchallenged, could threaten representative democracy and the cultivation of informed and engaged communities.

    Importantly, students were both resigned to and indignant about algorithm-driven ads, news, and information. Many found sites like Google, YouTube, Instagram, and Facebook too useful to abandon, and while they seemed resigned to the powers of an unregulated media environment, they were still willing to engage with the platforms to exert their agency and protect their privacy. Their concerns were often accompanied by a sense of impotence and, for some, nihilistic dread. While some students worried about the “creepiness” of algorithms that eavesdrop on their offline conversations to try to sell them a product, others had concerns about the real-life consequences of automated decision-making systems that reinforce societal inequalities.

    Faculty in our interviews often expressed frustration with, and a sense of powerlessness over, the ubiquitous algorithmic systems that affect society. They lamented the “loss of a common culture” and the compromising of privacy without accountability to the public. Their response was largely to stick to a narrow set of information sources, like The New York Times or NPR, and to avoid social media platforms altogether.

    References

    1. Maggie Adams, Ari Isaacman Bevacqua, and Anna Dubenko (19 December 2018), “The most-read New York Times stories of 2018,” Story #52: Matthew Rosenberg, Nicholas Confessore, and Carole Cadwalladr, “How Trump consultants exploited the Facebook data of millions” (17 March 2018). This story about Cambridge Analytica ignited harsh criticism from lawmakers about Facebook’s business dealings and use of algorithms. https://www.nytimes.com/interactive/...p-stories.html. See also Carole Cadwalladr (April 2019), “Facebook’s role in Brexit — and the threat to democracy,” TED Talk, https://www.ted.com/talks/carole_cad...t_to_democracy

    Contributors and Attributions

     


    This page titled 5.2.3: Conclusion and Recommendations is shared under a CC BY-NC-SA license and was authored, remixed, and/or curated by Alison J. Head, Barbara Fister, & Margy MacMillan.