Apple’s ad on over-sharing proves that there is a privacy problem
We have a privacy problem.
Just a few weeks ago, Apple released its new iPhone ad titled “Over Sharing.”
It starts off with a man shouting to a bus full of people that he “browsed eight sites for divorce attorneys today,” and later escalates to a woman giving her login information to strangers at a movie theater. Then we see a man yelling out his heart rate while jogging and a woman shouting out her credit card details.
Finally, the ad ends with the reassurance that Apple is committed to protecting its users’ privacy, closing with a line that reads: “Some things shouldn’t be shared. iPhone helps keep it that way.”
After publicly apologizing for allowing contractors to listen to users’ Siri commands without informing them, the company has hammered home its dedication to protecting the privacy of its users. Tim Cook, the company’s CEO, has repeatedly called privacy a fundamental human right.
A clear privacy problem
Ever since Edward Snowden exposed the extent of government surveillance, it’s safe to say nobody feels comfortable with their laptop cameras. And countless times we have been told that the government is monitoring every single one of our moves.
Not gonna lie, sometimes it’s hard to imagine any CIA agent just sitting there judging my cooking skills while also solving international conflicts. “Don’t they have bigger issues to worry about? Isn’t the world on the verge of another world war?”
It seems irrational for anyone to be watching our mundane activities all day long while there are bigger life-threatening events yet to solve.
However, our willingness to share our information has given companies like Facebook and Google an open book on human weaknesses. They have figured out the mind control game that we used to see only in movies. And now, the single biggest threat to humanity is the lack of privacy.
The lack of privacy issue
“If you are not paying for the product, then you are the product.” – The Social Dilemma, Netflix
Tech companies today are some of the most profitable companies of all time. And while many of these “users” don’t even pay for their services, the companies make money by selling ad space. The “users” are just a by-product for their real clients: big corporations.
Their main goal is to guarantee spaces where these ads would be successful. The better they know the user, the better their predictions become, thus the more certainty they have over their service. The Social Dilemma documentary on Netflix puts it simply: “great predictions begin with one imperative: you need a lot of data.”
Data begins and ends with engagement and growth. The more users scroll, like, and interact with each other, the better the algorithm gets at understanding their behavior.
They can predict what kind of videos will keep us watching, where we are going to go, and what actions we will take. Ultimately, they can predict what kind of emotions are most likely to trigger our behavior. We clearly have a privacy problem. And it is scary.
Companies like Facebook, Google, and Amazon are essentially building business models that predict our actions. This is something that Shoshana Zuboff calls “surveillance capitalism,” where tech companies compete for the user’s attention.
The true Truman Show
“It is the gradual, slight, imperceptible change in your own behavior and perception that is the product.”– Jaron Lanier, The Social Dilemma
What this does is far more concerning than a computer understanding you better than you understand yourself. The job of the algorithm is to figure out what to show you next to keep you on screen: to handle and control, in a skillful manner, the content users are exposed to in their everyday life.
In other words: manipulation.
In fact, manipulation is something that is very explicitly taught at many of the great technology schools. The Stanford Persuasive Technology Lab, for instance, teaches how to build technology based on the psychology of what persuades people: persuasive technology.
They use what psychologists recognize as “positive intermittent reinforcement.” This means implanting unconscious habits in users. Every time your finger pulls down on the screen, you expect the page to refresh. This technology is intentionally designed to modify user behavior and to incentivize certain actions.
Over the years, Facebook, Google, Twitter, and others have mastered the art of manipulation, to a degree we are not even conscious of. More information has been recorded about human behavior than can possibly be imagined. Houston, we have a privacy problem.
Now, the question is: why have these “tools,” created for good, become so negative?
When Facebook published the results of its famous massive-scale contagion experiment in scholarly journals, it emphasized two key findings. “One is that they know they can successfully manipulate subliminal cues in the online content to change real-world behavior or real-world emotion. Two, we can exercise this power or these methods while bypassing user awareness.”
Particularly during these times, the digital world has become our primary source of communication, community, and belonging. It has proved its ability to keep things going, even while people are forced to stay at home.
Still, the polarizing realities of the world have never been more evident. Anti-maskers claim that COVID-19 is a hoax, all while the overwhelming numbers of infections and deaths keep rising.
One thinks, “Don’t these people read the news?”
Cathy O’Neil, the author of Weapons of Math Destruction, said it better than anyone else: “Algorithms are opinions embedded in code.”
They are. That’s the problem. The human mind is vulnerable to persuasion. It was vulnerable in 1942 and it still is. Only this time it’s an algorithm with little to no human supervision that keeps feeding on our behavior. And we keep unconsciously complying with it.
In August, Apple announced a privacy-focused software update that curtails Facebook’s ad targeting for iPhone users. The company is elevating its efforts to address this concerning issue.
Now that the presidential election is right around the corner, the need to improve communication pathways and deliver legitimate news is critical. “The life of Americans depends on it,” as Michelle Obama said.
So, perhaps there could not be a better time for Apple to remind us all that “some things should not be shared.”