April 4, 2018; Scientific American
A woman in one of our groups told the story of being offered what was billed as an “upgrade” from a basic cellphone to a smartphone; her reply was, “No, I don’t want anything smarter than I am.”
That response is properly understood as resistance through humor. As she stated, “I have all that I need.” Many participants in our study expressed doubt that it really could be a lifestyle “upgrade” to have a phone “listening” to their conversations or to have multiple companies tracking their location through apps every minute of every day.
Another woman expressed dismay about another form of corporate monitoring, describing “when you’re shopping for a cutting board on Amazon … and then when I sign on Facebook, every single ad on the sidebar is for a cutting board. … That freaks me out a lot more than other things.”
Hidden inside the latest data hubris at Facebook and Cambridge Analytica is an old tradition of technology resistance. We are wondering whether that resistance will turn into more widespread action right around now, as Zuckerberg hits the Hill today to testify to his horror at being “duped” by the Russians into sharing lots of information about the real dupes (us). Cancelling Facebook accounts is just the most obvious form of resistance to predictive analytics. But mass media may be missing an older tradition of data skepticism.
A fascinating study of “Appalachian attitudes” toward technology was reported in Scientific American recently. The story suggests that people were already skeptical about their relationship with electronic media before the Facebook fiasco. What kind of conscious resistance will it take to get us closer to a less manipulated and more democratic dialogue?
The participants who used smartphones, tablets or computers reported more than 50 strategies they use to stay safe online, such as refusing suspicious contacts, restricting the information they post online or avoiding public Wi-Fi. Far from being passive or ignorant, they make many intentional choices about how to handle technology when they do choose to use it. Not all of this is Appalachian-specific; I’m not sure there is a culturally specific way to delete browser cookies.
Clearly, this study is far from definitive; observing just 24 subjects from a single social background in focus groups, where each participant learns from the others’ comments, is scarcely scientific. Still, the findings suggest that pushing back against predictive analytics may be well ingrained in some humans’ cultural DNA.
When Isaac Asimov created the fictional field of “psychohistory,” the hypothetical science had two defining features: it required huge amounts of data, and its subjects (ordinary citizens) needed to be ignorant of the findings. If the subjects were aware of the predictions, they would change their behavior. Asimov has proven more prescient than his real-life followers in the business of predictive analytics. The study reported in Scientific American suggests that ordinary folks have figured out what their data is being used for and are skewing the outcomes.
Of course, there are other problems with predictive analytics. Virginia Eubanks’ new book Automating Inequality catalogues a series of flaws in contemporary predictive analytics. In an interview with Slate, Eubanks says, “I argue in the book that we actually smuggle all of these political decisions, all of these political controversies, all of these moral assumptions, into those tools. Often, they actually keep us from engaging the deeper problems by overriding these deeper concerns. We often believe these digital tools are neutral and unbiased and objective. But just like people, technologies have bias built right into them.”
Common-sense data resistance may go beyond the tactics mentioned in the Scientific American article. Anyone who has watched television has heard the disclaimer, “Past Performance Is No Guarantee of Future Results.” Hello predictive analytics, meet chaos theory. Whoops, there goes your dataset.
Another flaw in predictive analytics, cited in both the Scientific American story and Eubanks’ interview, is the bias built into the field of data science. Nerds with degrees may not have experienced life in the worlds from which their data are extracted. Jazz legend Charlie Parker anticipated this flaw when he said, “If you don’t live it, it won’t come out of your horn.” Without a thorough grounding in the social system being studied, conclusions are likely to be erroneous.
Too much of what passes for knowledge in the age of Google amounts to hunting for patterns in huge amounts of data. Seeking a quick-buck shortcut, some in data science have forgotten a basic principle of the scientific method: Science begins when researchers form and test hypotheses. In other words, “correlation does not imply causation.”—Spencer Wells