Facebook can control elections!
Facebook can hop you up -- or bum you out!
There is something breathless about the reactions of some civil libertarians to the Facebook experiment that manipulated users' news feeds to determine the extent of something called emotional contagion. For example, Katy Waldman wrote indignantly in Slate that the research -- which selected some users for more negative news feeds than others -- made about 155,000 people sad for a day.
But then the lawyers stepped in to calm everyone down by suggesting that this study didn't actually manipulate Facebook users in a way that constitutes an "intervention" under the law. So that was a relief.
And along came the sociologists to say, "Who even cares what Facebook did? We're bombarded with commercial and manipulative messages all day long anyway."
As the dust settles, though, there are aspects of Facebook's research that are disturbing and should be debated publicly. First, the experiment was secret, and it is almost certainly not the only social experiment taking place at the corporation. Second, when the study was announced, Facebook's spokespeople were so obtuse that apparently no one there with authority realized that research on human beings without their consent is a problem, even if it doesn't meet the legal definition of an intervention. Third, the study presumed that personal data about Facebook users belong to Facebook, when most users believe the data belong to them.
Regarding the first concern, the secrecy is a problem because only a year ago we learned that Facebook's servers link to U.S. intelligence agencies through the PRISM program revealed by whistleblower Edward Snowden. Facebook asserted that it had no knowledge of the access; the National Security Agency (NSA) claimed it did. Someone isn't being straight about this. Given its record, chances are the culprit is the NSA, but knowing what we know now, it could be Facebook.
As for the second concern, the corporation's spokespeople don't seem to register the general angst about undisclosed research on human beings, conducted for any purpose, without their informed consent. The concern is especially acute when the corporation cooperates with a secretive, overreaching arm of the government linked to law enforcement.
Facebook's principal researcher, Adam Kramer (aka Danger Muffin), didn't do himself any favors with his explanation of the study. It was insensitive and tried to minimize the impact of what he did, beginning with the flip phrase "Okay, so." In his own Facebook picture, Kramer looks like an electrocuted imbecile in cheap sunglasses, which doesn't help either. To explain himself further, Kramer wrote...
Regarding methodology, our research sought to investigate ... by very minimally deprioritizing a small percentage of content in News Feed (based on whether there was an emotional word in the post) for a group of people (about 0.04% of users, or 1 in 2500) for a short period (one week, in early 2012). Nobody's posts were "hidden," they just didn't show up on some loads of Feed.
To finesse the offensiveness of manipulating a human interaction for purposes of secret experimentation, social researchers often go to absurd lengths to appear scientific, which is imperious and meaningless. This is exactly what Kramer and his team did in their publication:
Posts were determined to be positive or negative if they contained at least one positive or negative word, as defined by Linguistic Inquiry and Word Count software (LIWC2007) ... word counting system, which correlates with self-reported and physiological measures of well-being, and has been used in prior research on emotional expression. LIWC was adapted to run on the Hadoop Map/Reduce system ... and blah, blah, blah...
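For all the methodological throat-clearing, the procedure those two passages describe is simple enough to sketch in a few lines. The Python below is purely illustrative and is not Facebook's code: the word lists are stand-ins for the proprietary LIWC2007 dictionaries, and the 10 percent omission chance is an assumption made for the example, not the study's actual parameter.

```python
import random

# Placeholder word lists. The real LIWC2007 dictionaries are proprietary and
# contain thousands of entries; these few words are only for illustration.
POSITIVE_WORDS = {"happy", "love", "great", "wonderful"}
NEGATIVE_WORDS = {"sad", "hate", "awful", "terrible"}

def classify(post: str) -> set:
    """Label a post by the paper's rule: it counts as positive (or negative)
    if it contains at least one positive (or negative) dictionary word."""
    words = {w.strip(".,!?;:").lower() for w in post.split()}
    labels = set()
    if words & POSITIVE_WORDS:
        labels.add("positive")
    if words & NEGATIVE_WORDS:
        labels.add("negative")
    return labels

def build_feed(posts, suppress_label, omit_prob=0.1):
    """Assemble one load of a feed, skipping posts that carry the targeted
    emotional label with some probability (the "deprioritizing" Kramer
    describes). Omitted posts are left out of this load, not deleted."""
    feed = []
    for post in posts:
        if suppress_label in classify(post) and random.random() < omit_prob:
            continue
        feed.append(post)
    return feed

if __name__ == "__main__":
    posts = [
        "Had a great day at the beach!",
        "Feeling sad and awful today.",
        "Meeting moved to Tuesday.",
    ]
    # A user assigned to a "positivity reduced" condition:
    print(build_feed(posts, suppress_label="positive"))
```

That, more or less, is the whole trick: count emotional words, then quietly thin them out of some people's feeds and measure what those people post next.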
The problems of secrecy and imperceptive, dull-witted official reactions are compounded by the presumptions underlying the study. If we want to articulate exactly what's wrong here, we would say that it has something to do with theft. Facebook took information about its customers and used it, for purposes they never intended, to create something of value that does not belong to them -- all without telling them. And maybe Facebook does this in order to target ads, too, but then maybe the corporation shouldn't be doing that either.
Think of it this way. If you give your credit card information to a clerk in order to buy something, and the clerk notes the information and uses it to buy something for herself later without your knowledge, she's stealing from you. How is what Facebook did (and probably does) different from that?
According to Eileen Brown, a self-described social media consultant, the Facebook experiment is different because your data belong to Facebook and not to you.
Facebook has been accused of manipulating emotions. Well, get over it Facebook users. If you are a Facebook user, you willingly give Facebook every bit of data it has about you.
That data, as soon as you press submit, is data that Facebook can do with whatever it wants to. Whilst you might not have explicitly agreed to this at the time you signed up for the service, the agreement to use your data for research is now there in its terms of use.
Brown apparently decided this on her own, though. After all, Facebook itself wasn't so sure. That's why the corporation added the word "research" to its terms of use four months after the study took place. The terms of use Facebook users agreed to in September 2011 said nothing about the corporation's right to use their information for research. But in May 2012, the terms were altered so that they did. From this we can conclude that at least one lawyer in the corporate labyrinth foresaw a problem.
Moreover, whatever the terms of use included, no users agreed that their moods belonged to Facebook. If controlling your data means controlling your mood, then we need to take a step back.
Interestingly, the emotional contagion study is also self-serving from a corporate viewpoint. According to Kramer,
We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook. We didn't clearly state our motivations in the paper.
Happily, of course, Kramer's study found that the above concerns about Facebook are unfounded.
It all adds up to a net gain for the corporation and, as Kramer insists, a minimal loss for us. But if you're through listening to Kramer, it's worth noting that this whole episode is a net loss for Facebook, too. In the end, the minimizing legalistic and social-scientific arguments are academic, in the worst sense of the word. They mean nothing. Facebook, after all, is a corporation that depends on its customers' trust, and that trust has now been damaged in important ways a number of times.
We don't know whether Facebook collaborated with the NSA by handing our data over to the agency without objection. Nor do we know whether other experiments are now underway with our data at Facebook, as the corporation freely and furtively uses personal information that we thought belonged to us.
We do know that there was at least one lawyer at Facebook who believed that the applicable terms of use might not allow the company to take such liberties with our information. We also know that the corporation listened to him or her and altered the terms. So Facebook is responsive to the nuances here, once they're pointed out. Let's hope that the corporation now understands that ongoing experimentation like this is unethical, and that it stops.
Bea Edwards is Executive & International Director of the Government Accountability Project, the nation's leading whistleblower protection organization. She is also the author of The Rise of the American Corporate Security State.