Sleight of Hand and Data Laundering in Evidence Based Policy Making

I’ve still to make this year’s New Year’s Resolution, but one of the things I think I’d like to spend more time getting my head round is the notion of “evidence based policy making” (e.g. Is Evidence-Based Government Possible?).

As far as I can tell, this is often caricatured as either involving Googling around a policy area using ministerially obvious Google terms and referencing whatever’s in the top 5 hits, or taking a policy decision and then looking for selective evidence to support that decision, along with contrary evidence against competing alternatives (in a related area of evidence based practice, see for example: Some Questions about Evidence-based Practice in Education). If you have other examples in a similar vein, please let me know… #lookingForAnEvidenceBase See also the idea of policy based evidence making [h/t Jon Warbrick] ;-)

One of the suspicions I have is that “evidence” inherits the authority of the most reputable source associated with it when we wish to call on it in support of a claim (and, perhaps as a complement, the authority of the least reputable source when we wish to discount it?).

So for example, in his Networker column in the Observer last weekend (With friends like Facebook…), John Naughton describes a presentation given to a technology conference by Facebook’s chief operating officer, Sheryl Sandberg, that pre-empted a European Commission announcement on privacy:

Sandberg made claims about the economic benefits of privacy abuse that defy parody. For example, she unveiled a report that Facebook had commissioned from Deloitte, a consultancy firm, which estimated that Facebook – an outfit with a global workforce of about 3,000 – indirectly helped create 232,000 jobs in Europe in 2011 and enabled more than $32bn in revenues.

Inspection of the “report” confirms one’s suspicion that you couldn’t make this stuff up. Or, rather, only an international consulting firm could make it up. Interestingly, Deloitte itself appears to be ambivalent about it. “The information contained in the report”, it cautions, “has been obtained from Facebook Inc and third party sources that are clearly referenced in the appropriate sections of the report. Deloitte has neither sought to corroborate this information nor to review its overall reasonableness. Further, any results from the analysis contained in the report are reliant on the information available at the time of writing the report and should not be relied upon in subsequent periods.” (Emphasis added by JN.)

Accordingly, continues Deloitte, “no representation or warranty, express or implied, is given and no responsibility or liability is or will be accepted by or on behalf of Deloitte or by any of its partners, employees or agents or any other person as to the accuracy, completeness or correctness of the information contained in this document or any oral information made available and any such liability is expressly disclaimed”.

In this case, the Deloitte report [Measuring Facebook’s Economic Impact in Europe] was used as evidence by Facebook to demonstrate a particular economic benefit made possible by Facebook’s activities. The consultancy firm’s caveats were ignored in reporting this claim (including the fact that the data may, in part at least, have come from Facebook itself). So: this is data laundering, right? We have some dodgy evidence, about which we’re biased, so we give it to an “independent” consultant who re-reports it, albeit with caveats, and we then report it minus the caveats. Lovely, clean evidence. Our lobbyists can then take this scrubbed evidence to a lazy policy researcher, referencing it as a finding in the Deloitte report, so that it can make its way into a policy briefing. Or that’s how I imagine it, anyway…

John’s take was in a similar vein:

The sole purpose of “reports” such as this is to impress or intimidate politicians and regulators, many of whom still seem unaware of the extent to which international consulting firms are used by corporations to lend an aura of empirical respectability to hogwash.

Quite so. ;-) I think my concerns go further though – not only is the Deloitte cachet used to bludgeon evidence-poor audiences into submission, it may also perniciously make its way into documents further up the policy development ladder, where only the findings, and none of the caveats (including the dodgy provenance of the data), are disclosed.

So here are a couple of things for the data journalists to take away, maybe?

1) there may be stories to be told about the way other people have sourced and used their data. Where one report quotes data from another, treat it with as much suspicion as you would hearsay… Check with the source.

2) when developing your own data stories, keep really good tabs on where the data’s come from and be suspicious about it. If you can, be open about republishing the data, or links to it. (There’s a rough sketch of what keeping those tabs might look like just below.)
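To make point 2 a bit more concrete, here’s a minimal sketch (Python 3.9+; the figures, field names and URL are illustrative placeholders of my own, not taken from the Deloitte report) of what keeping tabs on a figure’s provenance might look like: each number carries its source, access date and caveats with it, and anything derived from it inherits those caveats rather than silently dropping them.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Figure:
    """A single data point plus the provenance that should travel with it."""
    value: float
    description: str
    source: str                       # who published the number
    url: str                          # where it was obtained
    accessed: date                    # when it was obtained
    caveats: list[str] = field(default_factory=list)
    derived_from: list["Figure"] = field(default_factory=list)

    def all_caveats(self) -> list[str]:
        """Caveats on this figure plus those of every figure it derives from."""
        collected = list(self.caveats)
        for parent in self.derived_from:
            collected.extend(parent.all_caveats())
        return collected

# A headline figure, recorded as it might be when first quoted (values illustrative)
jobs = Figure(
    value=232_000,
    description="Jobs indirectly supported in Europe, 2011",
    source="Consultancy report commissioned by the company it describes",
    url="https://example.com/report.pdf",   # placeholder, not a real link
    accessed=date(2012, 1, 29),
    caveats=[
        "Underlying data supplied in part by the commissioning company",
        "Publisher explicitly disclaims responsibility for accuracy",
    ],
)

# A derived figure keeps a link back to its parent, so the caveats survive summarisation
per_employee = Figure(
    value=jobs.value / 3_000,
    description="Jobs 'created' per company employee",
    source="Own calculation",
    url="",
    accessed=date.today(),
    derived_from=[jobs],
)

print(per_employee.all_caveats())  # the original caveats are still attached
```

The point of the `derived_from` chain is simply that when a summary of a summary gets quoted (the “recursive abstraction” problem mentioned below), the original disclaimers are still one method call away, rather than lost at the first republication.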

PS if you have other examples of data provenance laundering, please add a link as a comment to this post:-)

PPS see also How SOPA and PIPA did and didn’t change how Washington lobbying works: “The political scientist E.E. Schattschneider once called politics “the mobilization of bias.” By this, he meant something both simple and profound. All political battles are fights between competing interests, he noted, but political outcomes are almost always determined by the bias of those paying attention to the conflict. The trick is to make sure you mobilize the crowd that will cheer for you.”

PPPS A bit of history relating to the “data laundry” idea, originally in the context of scrubbing rights tainted records from library catalogue metadata: https://blog.ouseful.info/2011/08/09/open-data-processes-the-open-metadata-laundry/

PPPPS via the Twitter Abused blog, the notion of “recursive abstraction” (“where datasets are summarized; those summaries are then further summarized and so on. The end result is a more compact summary that would have been difficult to accurately discern without the preceding steps of distillation.”) and a corollary, in the sense of elements from a qualified infographic being republished in summary form without the original qualifications (yet presumably with even more need for qualification on top of the original disclaimers!)

Author: Tony Hirst

I'm a Senior Lecturer at The Open University, with an interest in #opendata policy and practice, as well as general web tinkering...

10 thoughts on “Sleight of Hand and Data Laundering in Evidence Based Policy Making”

  1. “taking a policy decision then looking for selective evidence to support that decision, along with contrary evidence against competing alternatives;” This is of course “Policy-based evidence making”.

