An industry analyst focused on data strategy, analytics and privacy argues that we need to apply critical thinking and exercise caution as we enter the age of “data ubiquity”.
THE COMING TECH DISRUPTIONS and revolutions have at times been predicted to fix all manner of societal and environmental ills, so at first glance Susan Etlinger’s warning to exercise caution and restraint can seem odd. But while data may be propelling advances in many fields, caution and critical engagement with the information we collect are essential if we are to avoid error and protect our privacy, Etlinger warns.
“At this point in our history, as we've heard many times over, we can process exabytes of data at lightning speed, and we have the potential to make bad decisions far more quickly, efficiently, and with far greater impact than we did in the past,” she said in a 2014 TED Talk.
Applying these ideas to the world of business, Etlinger advises companies on using big data and is a member of the Big Boulder Initiative, an industry organisation which promotes the successful and ethical use of social data.
We asked her about how cities and companies can safely and effectively handle information as we enter the age of data ubiquity.
Why is critical thinking so important to handling big data?
Here we are with more data than we know what to do with. And human beings have this tendency to give a lot of respect to technology. It’s funny, if you look at charts and graphs, if you look at studies that come out, people tend to trust charts and graphs quite a bit. What’s interesting is underneath that chart or that graph might be terrible data, and actually might be showing something that’s untrue. Or it might not account for something important.
In the age of nearing data ubiquity, are we keeping pace with our critical thinking and processing of this information?
It depends on the kind of data. If you think about, for example, something like weather prediction, which has got so good: the set of data you need and the possible outcomes are more or less constrained. Then when you get to things we call human data - human expression, text, speech, audio, any of that - interpreting meaning, and then even translating, and then interpreting meaning again, you get into some real challenges in terms of understanding what people actually mean.
For example, on Twitter you could see something like “Oh great, I dropped my phone and broke it.” And most natural language processing technologies will classify that as a positive statement. So things like sarcasm, things like where certain groups might use veiled language because they might be politically active under administrations that frown upon that. Even simple things like the language teenagers use, which changes all the time, can be missed.
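The sarcasm failure Etlinger describes can be illustrated with a toy bag-of-words scorer. This is a hypothetical sketch for illustration only, not any production NLP system; the word lists and function name are invented here:

```python
# Hypothetical lexicon-based sentiment scorer, illustrating why
# word-level scoring misreads sarcasm: it sees "great" and stops there.

POSITIVE = {"great", "love", "happy"}   # toy lexicon, assumed for this sketch
NEGATIVE = {"terrible", "hate", "sad"}  # note: "broke" is not in the lexicon

def naive_sentiment(text: str) -> str:
    # Strip simple punctuation and score each word independently.
    words = text.lower().replace(",", "").replace(".", "").split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(naive_sentiment("Oh great, I dropped my phone and broke it"))  # → positive
```

Because each word is scored in isolation, the scorer counts "great" as positive and never registers that the phone was broken; richer context (sarcasm, veiled language, shifting slang) is exactly what such models miss.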
Could you summarize some of the applications and ethical concerns regarding image recognition (the ability of computers to “read” and interpret images accurately) and emotional detection technology (which aims to analyse and interpret a person’s mood either through speech patterns, facial movement or other cues)?
Some of the potential uses are really interesting. So you could imagine that if you had a photographic record of a city over time you could understand a little bit about population patterns, you could understand where people live during different times, where people work, what commute patterns look like. You could understand sentiment, and whether people seem happier or sadder or more worried than they were before. You could look at things like interests, sports or purchasing patterns, what people eat, anything. The question is, given that you can do that, should you do that?
I think there are some ways in which this technology can help us understand our history better, I think there are ways in which it can help us understand others better.
[Regarding emotion detection technology] there are tremendous applications for social good (...) Delivering food and medicine to people who are difficult to reach. All the way to stuff for the elderly, stuff for the disabled. But at the same time those same technologies can also be used for mass surveillance, for other political purposes. They can be used for scary reasons too.
It’s a lot of power we potentially have at our fingertips now. So I’m arguing that we need to take a breath. And not stop it, because innovation will happen no matter what we do. But to really think about the ways we incorporate it into our businesses and into our society too.
© 2016, Green Builder Media. All rights reserved.