Monday, June 30, 2014

Facebook Can Manipulate You (and Has) With Its Feed Algorithm

I came across this post from David Holmes over on Pando. It may have been in the news previously, but this was the first time I'd seen anything about it. What happened, specifically, is that people at Facebook ran a basic experiment on their users. They modified the algorithm that delivers the news feed: for one set of users, they delivered more negative and fewer positive emotionally colored posts, and for the other set, more positive and fewer negative ones. As you might imagine, both groups began echoing and amplifying the tone of their feed -- the negative group produced more negative content, while the positive group produced more positive content.
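To make the mechanism concrete, here is a minimal sketch of the kind of filtering described above. This is illustrative only, not Facebook's actual algorithm; it assumes each post already carries a precomputed sentiment score between -1.0 and 1.0.

```python
# Hypothetical sketch of sentiment-based feed filtering -- an assumption
# for illustration, not Facebook's real code. Each post is a
# (post_id, sentiment) pair, with negative scores meaning negative tone.

def filter_feed(posts, condition):
    """Drop posts whose tone conflicts with the experimental condition."""
    if condition == "suppress_positive":
        return [p for p in posts if p[1] <= 0]
    if condition == "suppress_negative":
        return [p for p in posts if p[1] >= 0]
    return list(posts)

feed = [("a", 0.8), ("b", -0.5), ("c", 0.1), ("d", -0.9)]
print(filter_feed(feed, "suppress_positive"))  # → [('b', -0.5), ('d', -0.9)]
```

The real study reportedly suppressed only a fraction of matching posts rather than all of them, but the principle -- tilting the emotional mix of what a user sees -- is the same.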

The Pando post calls into question the idea that Facebook would simply experiment on its user base like that, without any permission beyond the basic terms of service. And while I appreciate the sentiment, and feel that at some level there ought to be a deeper underlying set of online rights of self, I find the idea of businesses experimenting with algorithms without the consent of their user base wholly unsurprising. After all, much of this is the essence of serving online content -- A/B and multivariate testing has been a common practice in online marketing for years.
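For readers unfamiliar with the practice, A/B testing is usually as simple as hashing a stable user identifier into a bucket, so each user consistently sees one variant. A minimal sketch, with names and the 50/50 split chosen as assumptions for illustration:

```python
# Minimal A/B test assignment sketch -- hash a stable user id so the same
# user always lands in the same variant. Function name and variants are
# illustrative assumptions, not any particular platform's API.
import hashlib

def assign_variant(user_id, variants=("A", "B")):
    digest = hashlib.md5(user_id.encode("utf-8")).hexdigest()
    return variants[int(digest, 16) % len(variants)]

print(assign_variant("user-42"))  # deterministic: same user, same variant
```

The point is how ordinary this machinery is: the Facebook study differs from everyday marketing experiments in what was varied (emotional tone), not in the technique itself.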

People want to believe that the content they are seeing is the same for everyone -- that it's like a book: once it's printed, that is what it is. But dynamic content, by its fundamental nature, is manipulated.

For me, there is a much bigger danger implied by the Facebook study referenced here. If you think of the ability to influence in this way, then the Facebook "scientific" paper is potentially a press release announcing an entirely new type of advertising. Imagine you wanted to broadly shift public opinion. With the right amount of money and access to the right platform, you could pay to shift the filter of the algorithm. This is not an AdWords type of program, accessible to anyone. Instead, it's a much more exclusive, strategic type of marketing -- probably expensive, with very limited availability, like product placement.

Perhaps the scariest place for it, and its most likely target, is politics and political advertising. Consider how this type of behavior manipulation might be used for political influence. Depending upon who is operating the platform, how sophisticated they are with their manipulation, and what kinds of limits or controls they place on this type of influence, the not-entirely-offbase conspiracy thinking can set your brain on fire. It makes you suspicious of any political content in the social media context -- except, as Facebook has shown, people are manipulated without even knowing it.

Clearly, it's an eye-opener and some serious food for thought. Is this where Facebook finally announces the real influence of the platform it has been building -- the platform everyone has believed to be worth so much more than social media advertising has, to date, proven to be?
