By Noraya Razzaque, M.A. Candidate, International Education Management & Conflict Resolution

In the second digital detox, Dr. Sarah Lohnes Watulak discusses practicing self-care in response to clickbait and the bombardment of information vying for our attention. How often do you click one interesting article or video only to find five more similar pieces of content automatically queued up and demanding your attention? Next thing you know, you’ve read five articles and watched ten videos without even trying. And the next time you log on to that website, you are shown recommended articles or videos based on what you have previously viewed, a hook designed to recapture your attention. Sometimes the stream of information is fascinating and feels like a fun adventure, but more often than not, it can bring us into contact with increasingly troubling content (see Zeynep Tufekci’s investigation of YouTube’s radicalizing recommendation engine).
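
To make that feedback loop concrete, here is a minimal, hypothetical sketch in Python of how a content-based recommender narrows what you see. None of this is any platform’s actual code; the catalog, tags, and similarity rule are invented for illustration.

```python
# Hypothetical sketch of a recommendation feedback loop. The catalog,
# tags, and scoring rule are invented; real systems are far more
# complex, but the narrowing dynamic is the same.

catalog = {
    "video_a": {"politics", "outrage"},
    "video_b": {"politics", "conspiracy"},
    "video_c": {"cooking", "travel"},
    "video_d": {"gardening", "travel"},
}

history = {"politics"}  # you clicked one political video
watched = set()

for round_num in range(1, 4):
    # Rank unwatched items by tag overlap with your viewing history.
    candidates = [name for name in catalog if name not in watched]
    best = max(candidates, key=lambda name: len(catalog[name] & history))
    print(f"Recommendation {round_num}: {best}")
    # "Watching" the pick feeds its tags back into your history,
    # so the next recommendation drifts further in the same direction.
    watched.add(best)
    history |= catalog[best]
```

Notice that the unrelated cooking and travel videos only surface once the similar content runs out; at the scale of a real catalog, it never does.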

Each time social media platforms redesign their user interfaces, they optimize to grab our attention and, more importantly to them, our data. Autoplay, for instance, has become the norm across social media and other platforms, with videos starting without our ever hitting “play” or “next.” An article for Facebook for Business gives businesses tips on best practices for marketing to potential customers on Facebook, including how to hold consumer attention with videos that use automation features. The article emphasizes pitching the marketing message in ten seconds or less and offers automated captions for video ads so the message lands even with the sound off, because, according to research, “when feed-based mobile video ads play loudly when people aren’t expecting it, 80% react negatively, both toward the platform and the advertiser.” Such automated features are marketed to internet consumers as a convenience: you don’t even have to think about what you view or read next. The message is: sit back, relax, and be entertained by a continuous stream.

The attention economy profits from the valuable data we give up while these technologies hold our attention. Our interests, likes, dislikes, purchase history, friendships, location, communications, behavior patterns, and much more become part of the data exhaust collected along the way. Project Information Literacy (PIL) describes “a rise of the ‘attention economy’ or ‘surveillance capitalism’: profitable industries gather ‘data exhaust’ from our interaction with computers to personalize results, predict and drive behavior, target advertising, political persuasion, and social behavior at a large scale” (Information Literacy in the Age of Algorithms, p. 7). All of this data collection might seem benign, or even be sold as a benefit, a convenience, but our data is used, shared, and sold in ways that are not transparent to us (see Bruce Schneier’s Data and Goliath). Further, our data is plugged into algorithms that automate decisions in our lives: everything from what loan packages will be offered to us on a bank’s website, to what schools our children will get into, to what jobs we’ll be offered, to how our work productivity will be measured, and much more (see Cathy O’Neil’s Weapons of Math Destruction).
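
To see how quietly collected data can drive a decision, consider a deliberately toy sketch of the kind of scoring model O’Neil describes. Every feature, weight, and threshold here is invented; the point is only that ordinary data exhaust becomes input to a decision the person affected never sees.

```python
# Toy illustration of an automated decision fed by "data exhaust."
# The features, weights, and threshold are all invented for this sketch;
# real scoring models are proprietary, which is precisely the
# transparency problem described above.

WEIGHTS = {
    "zip_code_income_rank": 0.5,    # a proxy variable that can encode bias
    "late_night_browsing": -0.2,    # behavioral data you never meant to share
    "purchase_history_score": 0.3,
}

def loan_offer(profile: dict) -> str:
    score = sum(WEIGHTS[key] * profile.get(key, 0.0) for key in WEIGHTS)
    # The applicant sees only the outcome, never the score or its inputs.
    return "premium rate" if score > 0.5 else "subprime rate"

applicant = {
    "zip_code_income_rank": 0.9,
    "late_night_browsing": 1.0,     # browsing habits quietly tip the result
    "purchase_history_score": 0.6,
}
print(loan_offer(applicant))  # -> subprime rate (score 0.43)
```

A human loan officer could be asked to explain a decision; here, a late-night browsing habit silently pushed the score below the cutoff, and no one is obliged to say so.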

In higher education, one way automation can shape student learning, knowledge, and worldview is during the information-seeking process, when search tools promote content similar to what a student has already viewed. PIL’s review of research shows that despite vast changes in the information landscape, students on the whole rely on familiar methods for finding information for class assignments, usually starting with Google (p. 9). Relying on search engines’ algorithmically suggested content can create echo chambers in which one’s existing ideas and theories are reinforced, working against diverse learning and engagement with differing opinions. Beyond the known issues of relying on Google for information, PIL points to a still-emerging area of inquiry: the educational technology platforms with which students engage regularly, often as a required part of their daily lives as students. Personalized learning and advising platforms, for example, employ algorithms, but we know less about their impacts on students. PIL asks us to reflect on the question, “What does ‘algorithmic justice’ mean to a new generation of students affected by these systems but perhaps unaware that they are at work — even in their daily interactions with campus learning management systems, such as Canvas, online textbooks and advising and retention software?” (p. 8).

As with all new features of technology, it is crucial to consider the long-term implications of relying on automation. This point is not entirely lost on some platform designers. In The State of Ethics in Design, Facebook Product Designer Maheen Sohail writes that “the attention economy serves as a great example of how we are no longer mindful of our user’s time” and says “it’s important for designers to take a step back and question the existing design patterns we’ve grown accustomed to… autoplaying videos might seem like a great way to increase product engagement… [but] we need to take a step back and think about what we’re asking of our users. By encouraging behaviours like binge-watching, is this deteriorating the overall health of a user?”

At the same time, we cannot wait for the platforms to move in the direction of algorithmic justice. So how can we disentangle ourselves from the algorithms that seek to keep our attention? From the user’s end, consider what it means to depend on automated features: why are they useful? Be mindful of what is suggested to you and ask yourself: why am I being shown this? When are automated features most useful? Are they a necessity? Are the tradeoffs worth it? In the age of automation and the incessant demands of the attention economy, we need to remind ourselves to be conscious of what we give our attention to and how that attention is being captured and used.

Take Action

  • Search for ways to turn off automatic features on your social media, apps, and other platforms that autoplay videos or queue up similar content. Here are instructions on how to do this on Facebook. Turning off these features can help you resist the demands they make on your attention.
  • Don’t click everything that is suggested to you. Resist the urge to read or watch content served up automatically based on your click history.
  • Protect your data privacy to keep companies from tracking your digital footprint and feeding it into algorithms. Use search engines like DuckDuckGo instead of Google to keep your searches from being tracked. Explore more privacy tools and resources listed here and from the DLINQ CryptoParty.
  • Unplug home assistant technologies, like Alexa or Google Assistant, when not in use, and consider leaving them unplugged most of the time. These are data-extracting devices (even when you don’t realize they are collecting data, they are).
  • Play with tools that confuse automation and algorithms, like the Track This tool, which obfuscates your browser data by opening websites unrelated to your browsing history (a minimal sketch of the idea appears after this list).
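
For the curious, the obfuscation idea behind tools like Track This can be sketched in a few lines of Python. This is a toy illustration, not Track This’s actual implementation; the decoy URLs below are placeholders you would replace with pages unrelated to your real interests.

```python
# Minimal sketch of history obfuscation: open sites unrelated to your
# real interests so trackers profile noise instead of signal. The decoy
# URLs are placeholders; Track This itself opens curated "persona" tabs.

import random
import webbrowser

DECOY_SITES = [
    "https://example.com",
    "https://example.org",
    "https://example.net",
]

def muddy_the_waters(count: int = 2) -> None:
    """Open a few unrelated sites in new browser tabs."""
    for url in random.sample(DECOY_SITES, k=min(count, len(DECOY_SITES))):
        webbrowser.open_new_tab(url)

if __name__ == "__main__":
    muddy_the_waters()
```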

Keep reading!
