One of the planet’s greatest threats is occurring right under our noses — and most of us don’t even care. In fact, we’re helping the threat along. That’s the unsettling argument made by The Great Hack, a new Netflix documentary that uses as its jumping-off point the 2018 revelation that Cambridge Analytica, a British political consulting firm, had harvested Facebook users’ data without their permission, all in the service of delivering pointed (and often misleading) ads in the lead-up to the 2016 presidential election and Brexit referendum. (Both Ted Cruz’s and Donald Trump’s campaigns hired Cambridge to gather what’s known as “psychological data,” which gives candidates a sense of the psychological makeup of potential voters through their online usage and responses to personality surveys.)
Directors Karim Amer and Jehane Noujaim chronicle the scandal from several perspectives — including professor David Carroll’s frustrated attempt to sue Cambridge to obtain his information — but they’re mostly worried about the story’s overriding implications. “Our humanity is at stake,” Carroll declares at one point in The Great Hack, and the film’s jittery, anguished rhythms do nothing to mitigate his urgency.
This subject matter is somewhat fraught for the filmmakers — especially Noujaim, who codirected Startup.com, the 2001 documentary about two friends who wanted to capitalize on the dot-com bubble of the 1990s, dreaming up a website that would help users navigate government bureaucracy by, say, paying a parking ticket more efficiently. The web seemed like a far more optimistic place back then — at last, disparate peoples across the world could connect and share ideas — and ever since, she’s shown the power of technology to educate and spur change in her subsequent documentaries Control Room and The Square, which earned an Oscar nomination. (Amer produced The Square.) But The Great Hack looks at the dark side of the internet’s innovation, casting viewers as potential victims of social media platforms and other online instruments that collect our personal information and then use it in ways we can’t imagine.
To that end, The Great Hack has no problem serving as a virtual town crier begging us to think more deeply about what we surrender by downloading apps to our phone or filling out Facebook questionnaires. In a recent phone conversation, Amer and Noujaim echo the alarmist tone that’s endemic in their documentary. For them, data collection is as terrifying as global warming — to their mind, the fate of democracy itself hangs in the balance. “I feel like this film is a warning cry,” Noujaim says at one point, later adding, “It’s a call to be aware.”
And therein lies the problem. To some extent, we’ve all pretty much already decided that we’re okay with our information being all over the internet. So can The Great Hack shock audiences into reconsidering their privacy rights? During our chat, Amer and Noujaim talked about why the Cambridge Analytica scandal is about more than elections, what it was like for Steve Bannon to be a fan of their work and why they haven’t deleted their own Facebook accounts yet.
The Great Hack greatly disturbed me. Then, about an hour later, I found myself checking Facebook. I guess I’m part of the problem — I can’t shut off.
Noujaim: I do believe that we’re all complicit, including myself and everybody that has a Facebook account and is on social media. But I also feel like it’s a false choice. I don’t think we’re calling for everybody to give up connecting with their friends and being a part of the connected world. I don’t think that the choice has to be that the price of entry is to give up all of your privacy.
I haven’t given up Facebook. I’ve posted about this movie on Facebook. Part of me feels guilty for [being on Facebook], and the other part of me feels that we should realize that we’re all complicit. At the same time, we need to hold these tech platforms accountable. Since such an overwhelming majority of the population is on them, [these platforms] have to start functioning more like a public service and keep ethics in mind to preserve our humanity rather than simply following the algorithm — a sort of amoral algorithm whose goal is to keep us on the platform for as long as possible and to get us to buy as much stuff as possible.
It seems that Facebook’s smartest strategy was to convince us that it was a safe, friendly place — it’s where we go to reconnect with pals and loved ones. We don’t suspect anything insidious, like targeted and misleading political ads, might be awaiting us.
Noujaim: That’s part of the problem — it takes us by surprise. The biggest problem is this issue of transparency. If you’re signing up for a service, paying for it and giving away data — and you’re aware of the targeting that’s happening, and of the data that you’re selling — then I don’t see a problem there. But this is all happening in a non-transparent way on these platforms where we share baby pictures.
We’ve experienced such fantastic things through what the tech world has brought us that we don’t expect that the pendulum could swing in the other direction. Karim and I made a film called The Square, which was about the Egyptian Revolution, where we saw firsthand Twitter and Facebook being used as important tools for change, for holding government accountable, for connecting people in various difficult circumstances. When I was arrested, the police tried to disappear people. It’s a dissuading tactic: If they make it as difficult as possible for loved ones to find you, it makes you think twice, three times, about going down to protest. So, anyway, I was arrested, but one of the lawyers put out my picture on Twitter, and within 20 minutes, another lawyer who looked into that [specific] police station was able to find me.
Amer: We found Jehane through Twitter. That’s literally how we found her and got her out of jail.
Noujaim: So, in a very personal way, we found that these were such important tools for change. Silicon Valley, at that point, took credit for helping the Arab Spring find its voice and hold Cairo accountable. But then we started to see those very same tools being used to target, to influence, to manipulate. So we felt like we needed to start working on another film.
The idea of “my personal data” is both so big and so vague. It’s hard to visualize what that entails.
Amer: We’re starting to see that our data is more important than perhaps we assumed. But we don’t really see the data — we don’t really see how it shapes us, yet we know we’re interacting with it daily.
That’s why it took us so long to figure out how to make this movie — it’s similar to the environmental movement where, yes, there are people who are feeling the anxiety of the effects of industrialization on the planet, but the imagery was lacking. And it took imagery — oil spills, for example — so people could see the wreckage inflicted on marine life. Those images could travel around the world, and people could understand that something fundamentally wrong was happening. With this [data-collection] problem, though, we’ve had all these big data leaks, but because we don’t see them, there’s no imagery — and so, there’s a deficit of language.
So we had to figure out how to make it a film. We wanted to show surveillance capitalism in action — the bedrock of this entire story. [Through animation], we brought to life the algorithm’s POV. We thought by showing people how the algorithm sees them, we could [illustrate] that, as moral creatures, we’re increasingly being run by an amoral algorithm.
I remember when Obama won in 2008 — part of the story of that election was his ability to utilize the internet to mobilize voters. Whether it’s him or Trump, we have to accept that online political ads are going to be our new reality, don’t we?
Noujaim: That’s a reality that we’re going to face in the future. With the success of this kind of advertising and targeting, [campaigns] are going to see this as an important tool. So the best defense is understanding how we’re being targeted and putting in certain kinds of guardrails. For example, when you see a [political] advertisement on TV, you hear the candidates say, “This is an ad that I approve of.” But you don’t have any such thing online right now.
In the U.K., if a company has your data — if it’s a U.K. company or does business in Europe — they’re required to hand over that data. But it’s also about a larger recognition and a conversation that needs to happen. We sign away all of our rights when we sign the user agreement — which we never read because we just want that app and we want it quickly. We would never do that with a paper contract. We need to have a serious conversation [about] us as the new commodity.
Amer: The thing we have to realize is that this is bigger than elections. We’re all persuadable. And the proof that we’re persuadable is that a lot of us are saying now that our phones are listening in on us. Or better put, our phones are not listening in on us — what’s happening is that there’s basically a voodoo doll of you that’s being fed all of your digital footprints in real time across so many aspects of your life. You may have different layers of consent that you feel you’re open to or less open to, but the algorithm strips it and takes what it wants.
The British parliamentary inquiry, when they finished their report [on the Cambridge Analytica data-harvesting revelations], came out calling Facebook “digital gangsters.” They came out and said that the electoral system in the U.K. is fundamentally unfit for purpose and can no longer work. It’s about whether we can have a free election ever again. And beyond that, the biggest thing, for me at least, is we’re seeing how fragile our governments are in the face of monopolistic tech platforms. Facebook doubles down and announces its own currency while it’s being investigated. Facebook gets hit with a $5 billion fine, and the stock price goes up that day. This shows how the balance of power is set up.
But we’re also seeing the fragility of our society when it’s based on these “shared values” [but] everybody’s living in their own personal realities. Perhaps the thing that’s scariest to us is seeing the fragility of truth, when information has been weaponized. I think the reason we’re ringing that alarm bell is that, in Egypt, we’ve seen the fragility of society and that democracy isn’t a God-given right — societies can turn fascist quickly. For America and the U.K. to believe that they’re above that is scary — this society is fragile.
Yet Silicon Valley — which wouldn’t exist without an open society, which wouldn’t have drawn engineers there to build the future without the ideals of the open society — feels no responsibility whatsoever to protect it. Why? Because these societal losses don’t show up on their balance sheet. They show up on our societal balance sheet, and no one has figured out how to reconcile that.
The recent extreme online blowback to FaceApp suggests that we’re at least somewhat more cognizant of what’s going on with these apps.
Noujaim: Yeah, there’s a growing awareness. When we started this film, no one knew who Cambridge Analytica was, and it was a much more difficult subject to speak about. I think the subject is going to become easier to talk about when this film comes out — and as more people write about the subject, and more films come out about the subject. You saw the images of the ethnic cleansing [in Myanmar] that was helped along by Facebook. You see a number of wreckage sites around the world that are quite alarming.
But you have to get people to care first, don’t you? There’s still this disconnect between what we click on and what gets captured from our personal data.
Amer: Carole [Cadwalladr, a reporter for The Guardian who helped break the Cambridge Analytica story] talks about meeting [Leave.EU backer] Andy Wigmore, and he’s like, “You know, Carole, we’re doing all this stuff [with] data and AI, and you’d be surprised — people just give you all their shit.” I think that’s kind of the point. We just give it away, just like what you saw with [FaceApp]. Everybody was like, “Oh my gosh, super-cool!”
A word that can be used more effectively in this context is “consent.” “Consent” has been defined and redefined constantly in these times, yet we have no consent when it comes to data. We’ve enabled a world where technology platforms design things that are quite misleading. Like Candy Crush — everyone thought it was a really cool game; meanwhile, it’s a massive data-collection operation. We have no idea what we’re participating in.
I don’t think that that’s going to happen without regulation. This idea that they’re just going to self-correct goes against the entire tradition of American regulation and the entire tradition of monopolistic entities and antitrust law.
Noujaim: I felt very similarly making this film to when I made Control Room. Watching the lead-up to the Iraq War, there was a completely different narrative that was being shared depending on whether you were watching Al Jazeera, Fox or CNN. It felt like you had people with this completely different understanding of reality based on what they were watching — how could they even attempt to communicate with each other if their basis for reality was different?
But in this situation where we are now, it was like Control Room on steroids. People in the same household couldn’t speak with each other because they’re being fed their own personalized reality on their newsfeed. For me, that’s one of the scariest parts of this [story], because we rely on being able to have healthy debate in order to have a healthy, functional democracy.
It’s ironic that Steve Bannon tried to buy Control Room. He was excited by those ideas back then. When I sold Control Room, he was a producer within the film industry. He’s been thinking about these ideas for a long time. He was running Wellspring at the time — Wellspring and Magnolia bid on the film, and we went with Magnolia. But I could’ve very easily had Steve Bannon as my executive.
Amer: He definitely sees the power of data. In the clip that we use in the film [of him], he goes on to say, “Data is the new land. We’re all serfs that have surrendered our freedom to the digital overlords.” He sees the vulnerability, but he wants to exploit it for a very different purpose — to fuel nationalism, which is very concerning.
Jehane, it’s been about 20 years since you made Startup.com. I remember that era, which felt a lot more idealistic about what the web could be. Do you feel more disheartened about the internet’s potential now than maybe you did back then?
Noujaim: I’d say that, for me, it has felt more like a rollercoaster. With Startup.com, there were these very idealistic views, but at the same time, [there was] the financial incentive with everybody wanting to participate in the first round of IPOs. But as you describe it, there was a little magic and idealism about what the internet would be able to do — making life easier, connecting with people in a way that never had seemed possible. That dream was present for me as well when we made The Square, because we did see [the internet] being a very powerful tool for change.
So the situation now… is it completely disheartening? I feel like this film is a warning cry that we need to be paying attention. If we’re not thinking for ourselves, then we’ll have algorithms and large tech companies thinking for us. It’s a call to be aware, but at the same time, we’re not denying the fact that there are incredible advances that are going to be made in medicine through big-data collection.
Basically, our data, as explained in the movie, is more valuable than oil. And it’s very personal. We need to have knowledge of how it’s being used. We need to have some kind of ownership — and some ability to have consent over how it’s used — because it affects all of our decision-making. It affects our future — not only of democracy but whether we have free will.
I do think that we’re living in scary times, and our hope is that this is the beginning of a conversation. But in no way do we think that everything that’s happening in the world of technology is negative.
Amer: After all, we both still have our Facebook accounts.