
I Watched The Social Dilemma — Here's My Takeaway

by Kristen Mae

I’ve worked in social media since the early stages of the Facebook algorithm. Back when the algorithm was still in its infancy and Facebook wasn’t yet totally overrun with ads, I didn’t hate that Facebook used one. It seemed logical that with such a massive amount of content, Facebook needed some way to prioritize and organize what people saw in their feeds. It was reasonable to me that they would highlight popular content and show that first.

Fast forward eight years and many, many evolutions of the algorithm, and we now have a finely tuned formula that not only factors in how popular a piece of content is but also remembers and tracks the kinds of content that attract your attention specifically. And once it has decided you like a particular theme, the algorithm shows you more and more and more of it. Especially if content on said theme is being promoted, i.e., Facebook is getting paid to show it to you.
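To make that concrete, here’s a rough, purely illustrative sketch of what an engagement-based ranking formula can look like. This is not Facebook’s actual algorithm; the weights, the Post fields, and the user_topic_affinity table are all made up for the example.

```python
from dataclasses import dataclass

@dataclass
class Post:
    likes: int          # popularity signals
    comments: int
    shares: int
    topic: str          # e.g. "parenting" or "politics"
    is_promoted: bool   # True if someone paid to boost this post

def rank_score(post: Post, user_topic_affinity: dict[str, float]) -> float:
    """Toy ranking score: raw popularity, weighted by how strongly this user
    has engaged with the post's topic before, with a bump for paid promotion."""
    popularity = post.likes + 2 * post.comments + 3 * post.shares
    affinity = user_topic_affinity.get(post.topic, 0.1)  # learned from past clicks and dwell time
    promotion_boost = 1.5 if post.is_promoted else 1.0
    return popularity * affinity * promotion_boost

def build_feed(posts: list[Post], user_topic_affinity: dict[str, float]) -> list[Post]:
    """The feed is just the candidate posts sorted by score, highest first."""
    return sorted(posts, key=lambda p: rank_score(p, user_topic_affinity), reverse=True)
```

The affinity multiplier is the part that matters: the more you engage with a theme, the bigger that number gets, and the more of that theme the feed serves you.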

So I thought I understood clearly enough how social media algorithms work. Then I watched the documentary The Social Dilemma on Netflix. Turns out, I was missing a critical piece of what the latest iterations of social media algorithms do: They don’t just figure out what you like and give you more of it. They intentionally try to manipulate what you like.

Social media algorithms no longer simply track and predict our behavior: they try to change it. They try to change us. On purpose. And they are succeeding.

The behavior that social media platforms want—the product they’re selling to advertisers—is our attention. Our sustained attention. They need as big a supply as they can possibly gather so that they can sell it to the highest-bidding advertiser. They need data that they can stick in front of an advertiser and say, “See how much consistent engagement we have? See how our users NEVER log off?”

If users (us) aren’t volunteering enough undivided attention, the algorithms present us with different, harder-to-ignore content that makes it more and more difficult to put down our device.


Granted, The Social Dilemma had a few eye-roll-worthy moments with its over-the-top, after-school-special family that rapidly deteriorates due to unfettered use of social media. Still, the documentary paints a compelling picture of how algorithms prey on our human proclivity to react to the fear-mongering, the unbelievable, the outrageous. You know that saying about how you can’t look away from a trainwreck? When it comes to social media, we show up for the human connection and stick around for the trainwreck of fear and outrage.

Not that we don’t have legitimate grievances about which to be outraged. Our current sitting president is a hot mess of unsubstantiated 3:00 a.m. Twitter rants with the apparent singular aim of pitting Americans against one another. Systemic racism is a reality many still deny. The pandemic that could have been squelched in March continues to rage on, killing nearly a thousand Americans per day because Trump lied to us about it, and now he’s also lying about lying about it.

There is plenty of justified outrage to go around. We’d be outraged even without social media, though it’s doubtful Trump would be president if it weren’t for social media.

That’s where my biggest takeaway from The Social Dilemma comes in:

Just a couple of weeks ago, several tweets with similar wording went viral sharing the utter falsehood that 39 sex-trafficked children had been rescued from a trailer in Georgia and that the news wasn’t covering it. Just one of those tweets, easily debunked with a quick Google search, was retweeted 158,000 times.

One hundred and fifty-eight THOUSAND, folks. This blows my mind. Oh, and fun fact: Twitter doesn’t have a way to report fake news. What the actual fuck, Jack Dorsey. Get it together.

Meanwhile, tweets about the real, far less outrageous story get retweeted maybe a thousand times.

Humans love consuming and commenting on and sharing fake news.

Americans consider free speech to be a sacred right. Some of the world’s greatest thinkers have claimed that there should be no limitation on speech whatsoever because truth always prevails. In the past, this may have been true. But currently, with the help of social media algorithms that feed the voracious appetites of our basest human impulses, falsehoods spread many, many times faster than truth. And it is changing our social landscape in a real and tangible way.

This point worries me more than anything else that was said in The Social Dilemma. Social media algorithms addict people to misinformation and rage. They’re feeding people so much garbage that they can no longer tell the difference between fact and fiction. Free speech that causes this kind of harm, that changes people’s very realities, is not something we should be okay with. Is there really anything more important than truth? Are we really willing to sacrifice truth for free speech? Isn’t that a paradox? How can freedom exist without truth?

What is happening right now across social media platforms is an example of capitalism gone horribly wrong. This is not a problem that individuals will solve themselves. This is not a problem that will be corrected by market forces. Unless by “corrected” you mean that society as we know it will collapse and we’ll start from scratch and hopefully do better next time.

Despite all of this, my takeaway from The Social Dilemma, believe it or not, is not that we should delete our social media accounts. In fact, I think the people who take this from the documentary are missing the point. Yes, we should endeavor to be savvy users; yes, we should intentionally limit the time we spend on our devices, turn off notifications, and consider deleting some apps from our phones so we only check our accounts on devices that aren’t always in our hands.


But the absolute most important thing we can do to fix the social media dilemma is pressure our elected officials to regulate this industry. It is absolute magical thinking to believe that deleting your social media accounts will make one shred of difference in the grand scheme of things. Friend, if you are a person who has enough willpower to delete social media, you are not the problem. Your singular level-headed presence simply will not be missed.

We must mandate that social media platforms build in mechanisms to identify and halt the spread of fake news. One of the speakers in the documentary stated that it isn’t possible to create artificial intelligence that can identify fake news. I call bullshit on that. Fake news spreads at least six times faster than real news, regularly uses emotionally charged language and punctuation, and elicits outsized, outraged reactions from those who love to share it. It’s not hard to spot. There’s got to be a teenager out there who can write this code.
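For what it’s worth, here’s a back-of-the-napkin sketch of the kind of heuristic I mean. The phrase list, the thresholds, and the six-times-faster check are all invented for illustration; a real system would layer on fact-checking databases, source reputation, and human review.

```python
import re

# Invented examples of emotionally charged phrases; a real list would be far larger.
CHARGED_PHRASES = {"shocking", "outrage", "cover-up", "wake up", "they don't want you to know"}

def looks_like_fake_news(text: str, shares_per_hour: float, typical_shares_per_hour: float) -> bool:
    """Toy heuristic flagger: content that spreads unusually fast AND leans on
    charged language, shouty capitalization, or stacked punctuation gets flagged
    for review. All thresholds here are made up for the example."""
    lowered = text.lower()
    charged_hits = sum(1 for phrase in CHARGED_PHRASES if phrase in lowered)
    punctuation_runs = len(re.findall(r"!{2,}|\?{2,}", text))   # "!!!" or "??"
    shouty_words = len(re.findall(r"\b[A-Z]{4,}\b", text))      # "NEVER", "SHARE"
    spreading_fast = shares_per_hour > 6 * typical_shares_per_hour
    emotional_score = charged_hits + punctuation_runs + shouty_words
    return spreading_fast and emotional_score >= 2
```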

Social media platforms should also be required to limit ad content. They should be mandated to maintain a certain ratio of “friends and family” versus promoted content in every user’s feed. Platforms could also devise a system that rewards and adds weight to news outlets that regularly demonstrate integrity and veracity in their reporting, and punishes outlets that regularly spread news that is later proven false. And when news is proven false, corrections should be published to everyone’s feed.
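Sketching those two mandates in code isn’t hard either. The example below builds on the toy ranking sketch above; the 20 percent cap on promoted content and the credibility weights are numbers I made up to show the shape of the idea, not actual policy proposals.

```python
# Builds on the toy Post class and rank_score() from the ranking sketch above.

def enforce_promoted_ratio(ranked_feed: list, max_promoted_share: float = 0.2) -> list:
    """Walk the ranked feed and skip promoted posts whenever letting one through
    would push paid content above the mandated share of what the user sees."""
    feed, promoted_shown = [], 0
    for post in ranked_feed:
        if post.is_promoted:
            if (promoted_shown + 1) / (len(feed) + 1) > max_promoted_share:
                continue  # hold this ad back; the friends-and-family ratio would be violated
            promoted_shown += 1
        feed.append(post)
    return feed

def outlet_credibility(stories_published: int, corrections_issued: int, stories_debunked: int) -> float:
    """Toy credibility multiplier for a news outlet: issuing corrections earns a
    small reward, while publishing stories later proven false is penalized.
    The resulting weight would scale that outlet's rank_score in the feed."""
    if stories_published == 0:
        return 1.0
    correction_rate = corrections_issued / stories_published
    debunk_rate = stories_debunked / stories_published
    return max(0.1, 1.0 + 0.5 * correction_rate - 2.0 * debunk_rate)
```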

The leadership of these platforms will say they can’t do this, but, again, I call bullshit. They don’t want to do this because it would mean a massive investment, and their wealthy stockholders would be pissed. But don’t let them tell you they “can’t.” Bullshit. Like I said before: capitalism cannot solve this.

Everyone should watch The Social Dilemma. Even if you don’t agree with every point made in the documentary, at the very least, it will give you some things to think about. One thing we can all agree on when it comes to social media is that we can’t continue down our current path. One way or another, something somewhere is going to give. Whether or not we commit to taking responsibility for how that “give” will look remains to be seen.
