It’s not naïve to be mad at Facebook

At this point in the Great Facebook Manipulation Debate, we are no longer talking about the ethics of the study itself; we are instead talking about how people are talking about it. Which is fine, but it’s amusing how quickly things go meta.

There’s been some pushback against the idea that anyone ought to be concerned. Data scientist Monica Rogati:

Of course, the emotional manipulation of advertising has been fertile ground for research, activism, and satire for decades. See also: “Weird Al” Yankovic, Saturday Night Live, John Carpenter, Negativland, etc., etc.

Likewise, there’s this intensely patronizing bit from an Internet Luminary:

Because, you know, if everyone does it then we shouldn’t be upset about it.

Tal Yarkoni, defending Facebook, writes:

I hope that people who are concerned about Facebook “manipulating” user experience in support of research realize that Facebook is constantly manipulating its users’ experience. In fact, by definition, every single change Facebook makes to the site alters the user experience, since there simply isn’t any experience to be had on Facebook that isn’t entirely constructed by Facebook.

… which is as good an example of “missing the point” as any. It is inarguable (and somewhat academic) that Facebook is an artificial construct and a manufactured experience.

What makes people uncomfortable is Facebook silently manipulating users, not just the user experience, for its own purposes. This concern is well-founded; from frictionless sharing to social ads, Facebook has tried to take the autonomy out of word-of-mouth marketing. There is absolutely no reason for anyone to give Facebook the benefit of the doubt.

We are not morons, and opposition to algorithmic manipulation is not naïveté. So, for everyone who is “confused” that people are “shocked” that Facebook manipulates the timeline, here’s the reason. The fear has always been that the algorithms shaping what we see will either misinterpret our interests, hiding important things from us, or censor the news we get in order to serve someone else’s ends. Facebook triggered both of those fears at once.

That Facebook was removing “positive” and/or “negative” posts from people’s timelines suggests that we missed something important; that Facebook apparently did it without empathy reminds us that Facebook is essentially farming our attention for advertisers.

This is why people are upset. And they probably have a right to be.
