Real Criticism, The Subject Supposed to Know

Link: http://feedproxy.google.com/~r/DemandingChange/~3/n81RFnewbS0/real-criticism-subject-supposed-to-know.html

“Goodbye, Anecdotes”, says @Butterworthy, “The Age Of Big Data Demands Real Criticism” (AWL, January 2013). Thanks to @milouness, who comments “Important concepts here about what is knowable!”. The article tries to link Big Data with Big Questions about the Big Picture, and what @Butterworthy calls The Big Criticism. From this perspective, Bill Franks’ advice, To Succeed with Big Data, Start Small (HBR Oct 2012), is downright paradoxical.

But why would we expect Big Data to help us answer the Big Questions? Big Data is rather a misnomer: it mostly comprises very large quantities of very small data and very weak signals. Retailers wade through Big Data in order to fine-tune their pricing strategies; pharma researchers wade through Big Data in order to find chemicals with a marginal advantage over some other chemicals; intelligence analysts wade through Big Data to detect terrorist plots. Doubtless these are useful and sometimes profitable exercises, but they are hardly giving us much of a Big Picture. Big Data may give us important clues about what the terrorists are up to, but it doesn’t tell us why.

A few years ago, Chris Anderson promoted The End of Theory, and published an article claiming that The Data Deluge Makes the Scientific Method Obsolete (Wired June 2008), although this may have only been an ironic reference to Fukuyama’s earlier idea of The End of History. Claiming obsolescence seems like hyperbole, although scientific method has always been modified by technological progress. Even in mathematics, computer power and human brilliance have combined to crack some previously unsolved problems. See for example, Proof and Beauty (The Economist, March 2005).

Although @Butterworthy claims to have identified some critical (“Big Critical”) questions, there seems to be only one real question – the dialectical question of quantity becoming quality. Are we on the cusp of aggregating utilitarianism into new tyrannies of scale? Are the numbers so big that they leave interpretation behind and acquire their own agency? How much information, and of what kind, would you need to conclude something – for example, gender bias in the media?

A recent academic study looked at 2.4 million pages of newspaper and came to the conclusion that there was some gender bias. That’s a lot of newspaper. It’s like looking at every single grain of sand in the forest and testing it for ursine faeces. (In other words, looking for microscopic proof that bears defecate in the woods.) From a technophile perspective, Big Data seems to be raising the bar for scientific methodology: following this impressive piece of research, those who don’t understand the concept of statistical significance can dismiss any smaller study – for example, one that merely studied thousands of pages – as unscientific anecdote. At a stroke, decades of careful analysis by feminists can be discredited because their sample sizes were too small by modern Big Data standards, and so there is now less scientifically credible evidence of gender bias than there was before.
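The point about statistical significance can be made concrete with a little arithmetic. The sketch below uses entirely hypothetical numbers (a sample of 2,000 pages, 60% of which exhibit the measured bias, tested against a 50% null) and a standard two-sided z-test for a proportion; it is not a reconstruction of either study, merely an illustration that a sample in the thousands is already overwhelmingly decisive, so the extra 2.4 million pages add little statistical power.

```python
import math

def z_test_proportion(successes, n, p0=0.5):
    """Two-sided z-test for a proportion against null p0 (normal approximation)."""
    p_hat = successes / n
    se = math.sqrt(p0 * (1 - p0) / n)   # standard error under the null
    z = (p_hat - p0) / se
    # two-sided p-value from the standard normal CDF (erf-based, stdlib only)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical: 1,200 of 2,000 sampled pages show the bias (60% vs a 50% null)
z, p = z_test_proportion(1200, 2000)
```

With these assumed figures the z-score comes out near 9 and the p-value is vanishingly small – which is precisely why the “thousands of pages” studies dismissed as anecdote were never statistically underpowered in the first place.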

Seriously, how many pages of newspaper do you have to read to convince yourself of gender bias? Clearly this is an example of Big Data getting in the way of the Big Picture. @Butterworthy clearly understands this danger, and sees the redemptive possibility of Big Crit (whatever that is) revitalizing the notion of critical authority and restoring some balance to the universe. I’m not sure I follow how he thinks that is going to happen.