Saturday, March 24, 2018

ICYMI - the NYT Story on Facebook’s Chief Information Security Officer Leaving

So I came across a link to this story again this morning -- I'd seen references to it earlier in the week. The story, Facebook Exit Hints at Dissent on Handling of Russian Trolls by Nicole Perlroth, Sheera Frenkel and Scott Shane, was published on March 19. I think the first thing I saw about this article was some back and forth about the NYT having changed the article to soften its treatment of Sheryl Sandberg, but I didn't actually dive into the piece at that time. What drew me back to it was a reference to this quote from a former Facebook employee.
“The people whose job is to protect the user always are fighting an uphill battle against the people whose job is to make money for the company,” said Sandy Parakilas, who worked at Facebook enforcing privacy and other rules until 2012.
Not that that isn't already apparent from the Mark Zuckerberg interviews this week. Clearly the business is in full damage-control-spin mode. In fact, as you read through the piece, it's hard not to come away with the feeling that Facebook management is attempting to do everything it can to avoid really addressing this issue. And they certainly don't appear to be making substantive changes to their operations. Underlying all of this is probably the recognition that these issues strike at the heart of their business model.

Personal Data -> Super Advertising Demographic Targeting -> In a Box
In reflecting on it, I'm reminded of this story from 2012, How Companies Learn Your Secrets, about Target's big data team. This is the story where Target took all of its purchasing data, linked it with a bunch of demographic data, and then, through statistical analysis, was able to predict things like when a customer was pregnant based on her purchasing habits.
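To make that "statistical analysis" step a bit more concrete, here's a toy sketch of the kind of purchase-history model the article describes. This isn't Target's actual model (which was never published); the product categories, the made-up data, and the choice of a simple logistic regression are all my own illustrative assumptions.

```python
# Toy sketch of purchase-history prediction, loosely in the spirit of the
# Target story. Feature names, data, and model choice are invented for
# illustration only -- the real model was never published.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row is a fictional shopper; each column is whether they recently
# bought a given product category.
feature_names = ["unscented_lotion", "prenatal_vitamins", "cotton_balls",
                 "lawn_mower", "wine_glasses"]

X = np.array([
    [1, 1, 1, 0, 0],   # lotion, vitamins, cotton balls
    [1, 0, 1, 0, 0],
    [0, 0, 0, 1, 1],   # lawn mower, wine glasses
    [0, 1, 1, 0, 0],
    [0, 0, 0, 1, 0],
    [1, 1, 0, 0, 1],
])
y = np.array([1, 1, 0, 1, 0, 1])   # 1 = known pregnancy (e.g. via a baby registry)

model = LogisticRegression().fit(X, y)

# Score a new shopper's basket: lotion + cotton balls, nothing else.
new_shopper = np.array([[1, 0, 1, 0, 0]])
print(dict(zip(feature_names, model.coef_[0].round(2))))
print("predicted probability:", model.predict_proba(new_shopper)[0, 1].round(2))
```

The point of the sketch is just that once you have purchase histories tied to known outcomes, a very ordinary classifier can start flagging shoppers long before they've told anyone anything.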

In some sense, what Facebook does is take all of this advanced technical work that Target did, and sell it to advertisers, pre-packaged and conveniently gift-wrapped. In many respects, the issues from the 2016 election, Cambridge Analytica, and Facebook are all stories about this aspect of marketing. Consider this quote from that story:
“With the pregnancy products, though, we learned that some women react badly,” the executive said. “Then we started mixing in all these ads for things we knew pregnant women would never buy, so the baby ads looked random. We’d put an ad for a lawn mower next to diapers. We’d put a coupon for wineglasses next to infant clothes. That way, it looked like all the products were chosen by chance.
“And we found out that as long as a pregnant woman thinks she hasn’t been spied on, she’ll use the coupons. She just assumes that everyone else on her block got the same mailer for diapers and cribs. As long as we don’t spook her, it works.”
Of course, as I've noted in the past, with Facebook and our broader experiences on the web, there's a built-in aspect of believing that "everyone else on the block got the same" page / view / experience. In the case of Facebook, their core platform and their business model are all about this data -- masked by the presence of family and friend photos so that they "don't spook" users.
