Big Data Ethics: Not the New Oil!

I have longed, and I have searched. Finally, some wise words about the too-easy metaphor of data being the new oil, which I involuntarily loathe almost more than the human and natural consequences of oil itself (that was an exaggeration, mind you).

As it tends to be, it is the Data Artist in Residence at The New York Times, Jer Thorp, who comes to the rescue:

“It’s exciting stuff for marketing types, and it’s an easy equation: big data equals big oil, equals big profits. It must be a helpful metaphor to frame something that is not very well understood; I’ve heard it over and over and over again in the last two years.

The comparison, at the level it’s usually made, is vapid. Information is the ultimate renewable resource. Any kind of data reserve that exists has not been lying in wait beneath the surface; data are being created, in vast quantities, every day. Finding value from data is much more a process of cultivation than it is one of extraction or refinement.

[…]

Finally, we need to change the way that we collectively think about data, so that it is not a new oil, but instead a new kind of resource entirely. For this to occur we need to foster a deep understanding of data in society. As it happens, humanity has a mechanism for this kind of broad cultural change: the arts. As we proceed towards profit and progress with data, let us encourage artists, novelists, performers and poets to take an active role in the conversation. In doing so we may avoid some of the mistakes that we made with the old oil.”
Jer Thorp on the Harvard Business Review blog.

It is, of course, nothing completely new that you have to think about the ethics of data. I believe it was something we learned in our first-semester sociology class. It is nonetheless of immense importance to highlight, because it is growing into a bigger and bigger problem as more and more private companies own more and more personal data.

The consequences are far more far-reaching than one researcher disclosing personal data about 10-100 people. We are now talking about sensitive data on millions of people in the hands of for-profit companies, who are often also part of a bigger data marketplace where gigantic data sets can be bought and merged, and consequently reveal things about us that we never intended. Or the data can simply be hacked by less ethical actors.

What I would like to see in this space is more of what UN Global Pulse is doing. They try to bridge the data available in academia, the UN system, and private enterprises, always with privacy and the global good in mind. That is just about the most beautiful combination of values I have ever heard of.

When doing good with big data, there are all sorts of pitfalls where you can suddenly do bad. Knowing that the risk exists is the first step; taking it a step further and stating that privacy is a human right, as UN Global Pulse does, puts the ideology in place to do amazing things.

“Any initiative in the field ought to fully recognise the salience of the privacy issues and the importance of handling data in ways that ensure that privacy is not compromised. These concerns must nurture and shape on-going debates around data privacy in the digital age in a constructive manner in order to devise strong principles and strict rules—backed by adequate tools and systems—to ensure “privacy-preserving analysis.””

UN Global Pulse in the white paper below

Using big data for development and the global good is something entirely different from using it for profit, but the underlying premises are the same. Data must be used ethically, and privacy must always come first, no matter how much money or how big an improvement to your service is at stake. Both points are highly relevant for academia, global development agencies, and private companies alike. And the frustrations are very similar. The rules must be the same.
