The Open Mind explores the world of ideas across politics, media, science, technology, and the arts. The American Prospect is re-publishing this conversation.
Karim Amer’s The Great Hack is a tour de force of the digital dystopia, law-breaking, and anti-democratic realities of the internet age. Amer explores how Cambridge Analytica exploited the demons of social media in the Brexit campaign and in the run-up to the 2016 U.S. presidential election. This is a discussion of the weaponization of the digital economy that preys on an increasingly vulnerable, misinformed, and disinformed globe.
Alexander Heffner: What is the latest on this scandal?
Karim Amer: Here’s the reality. Facebook has become a crime scene, and Facebook needs to be held accountable, and we need to be able to use the rule of law to do so. The $5 billion fine did nothing to Facebook. In fact, their stock price went up and they made $6 billion that week. So the market effectively signaled that the fine was nothing. We have to decide as a society what’s for sale and what’s not for sale, and if the entire democratic process has become commoditized, what are we going to do about it? I think it’s time for a new social contract. I think that’s what the clamoring is about. But in 2019, or as we enter into 2020, that social contract is no longer between citizens and government. It’s between citizens, government, and tech platforms. And it comes in the name of a user agreement. So I think we should be looking for who the new authors of an equitable user agreement are, and how that agreement could be something that reignites confidence and faith in our ability to use the most important phenomenon of our lives, the internet, in a safe and clean way. And if that means the internet needs to come with safety, with seat belts, maybe it does. But we have to start. We have to start having that conversation, because our democracy is at stake.
Heffner: Negligence is the key word here, because social media have abdicated their oversight prerogative. Are they still going to contract with third-party vendors who could potentially exploit stolen data the way Cambridge Analytica did?
Amer: Facebook’s definitely taken some steps since 2016 to clean up their act a bit. However, we still don’t have basic knowledge from Facebook about what did or didn’t happen during Brexit and 2016. They have still refused to turn over the evidence, in full detail, about which ads were run, which weren’t, and who paid for them. So we need more transparency and we need Facebook to do better. I mean, no one believes that Mark Zuckerberg and his team wake up and think, how are we going to wreck democracy today?
Heffner: Right.
Amer: The thing is that running ads based on people’s psychology and personalizing ads is not actually inherently bad, right? These are all tools, technological tools, and tools by their very nature aren’t good or evil, right? I think the problem is that this is happening in what I think is going to be called the Wild West of the data world, because there are just no parameters. That is because of an entire business model fueled by venture-backed Silicon Valley startups, with everybody wanting to be the next great big-data company, and without any kind of ethical limitations or regulations saying what people can or can’t do. And most importantly, you know, as you said, one word was negligence. Another word that we have to talk about is consent. I think the big problem with this conversation is consent. We don’t have a consensual relationship with the way in which our data is used and not used. And that’s something that we need to figure out. If data is simply your recordable human behavior, what aspect of your life are you comfortable having surveyed and shared and sold to a whole network of brokers that can then lead to information and profiles being created about you?
Heffner: One example of that is using Mozilla Firefox. As soon as you use Firefox instead of Safari or Chrome, you’re saying, I’m not going to let my browser make money off of my personal browsing history.
Amer: You can opt into nonprofit entities, and that is an important and noble thing to do. However, I would say that the challenge we are facing in this space of information warfare is at the size and scale of something like climate change; we just haven’t had the words for it.
Heffner: What’s next in terms of your pursuit of digital justice?
Amer: The Great Hack is the beginning of a conversation; it opens a door onto a space that a lot of people from around the world clearly want to be talking about. And I think we are looking now at how we can continue that conversation. Behavior change isn’t implicitly a bad thing. We need advertising; we need marketing to communicate effectively. But I think there’s a difference between persuasion and manipulation, and we need to start figuring out where those boundaries and borders are in different aspects of our lives. Are we entering a world where algorithms that are amoral in nature are going to be determining, you know, who gets access to certain education and who doesn’t? Are we living in a world where criminality scores determined by algorithms are going to be the ultimate arbiters of justice? Are we living in a world where, you know, an algorithm is predetermining your ability and your family’s ability to have access to certain financing and to not have access to certain choices in life? And if so, who’s determining the ethical boundaries of these algorithms, and who’s watching the watchers?