Wow, what a couple of weeks it’s been for Facebook and its seemingly out-of-control algorithms: audience views for legitimate outlets are plunging; ad categories catering to racists were somehow created; Russian troll farms papered the site with inflammatory ads during the elections.
Whew! It’s almost like no humans are in charge over there, except for popping up at Ad Week and other conferences to talk about how much money they’re making and how many people are using their platforms these days.
Yes, Facebook has roughly 2 billion users (and Instagram just crossed 800 million users, the company announced at Ad Week). And yes, Facebook is raking in many billions of dollars from digital ads every quarter. Only Google makes more from digital advertising than Facebook does.
But when you occupy such a central place in the Internet, you have some responsibilities. What we’re continuing to see from Facebook is a string of irresponsibilities that have actively undermined our culture, politics and community fabric. Who’s taking responsibility for that?
As the company digs out of a series of controversies, most self-inflicted, one has to wonder whether it’s time for some cleanup at the top of the admittedly hugely successful company.
Let’s review. To start, one of the Chicago Tribune’s top digital editors, Kurt Gessler, detailed his internal analysis of dramatic changes in the way the paper’s posts have been performing on Facebook over the past several months. To put it simply, the Trib is striking out a lot more often with Facebook readers than it was this time last year.
About one in three posts now is a stiff, compared to a rate in 2016 that was closer to one in 90. That’s a painful shift for the Trib’s bottom line, no doubt.
Gessler looks at some of the possible sources of the high strikeout rate, including the company’s decision not to do Instant Articles. He also compares the Trib’s content with that of a couple of dozen competitors on issues such as post frequency and use of video; the Trib lands somewhere in the middle. Basically, there aren’t any obvious culprits for the decline.
Gessler’s still-puzzled conclusion: Facebook has fiddled with its algorithm in some significant ways. I’d be interested to see whether other publishers, especially text-centric ones, are seeing similar problems.
No one owes the old-line newspapers a living, of course. But in terms of choosing which outlets to highlight and which to bury, might it make sense to give a little more weight to the professional news organizations staffed by trained and dedicated journalists? You’d think the Facebook algorithm might treat their work with a bit more love than, say, the fake-news generators who got so much lucrative access last year.
On a related front, CEO Mark Zuckerberg said the company would turn over to Congress material related to 3,000 ads that it had sold last year to a Russian troll farm that were used to target millions of voters in the 2016 elections. Facebook had already provided the material to Robert Mueller’s independent investigation of Russian election manipulation and possible Trump campaign involvement.
Reports suggest the 3,000 ads were targeted to audiences likely to be stirred up over Muslim immigrants and Black Lives Matter activists.
The investigations have sparked congressional interest in regulating what’s happening here. A new bill would target the shadowy political ads on Facebook, Google and Twitter that operate with little oversight. I’m skeptical such a bill would be effective, or even constitutional, but it suggests political concerns are rising — a growing problem for Facebook.
Tied to all this, the Tow Center for Digital Journalism has been doing a deep dive on ad-tech companies at the heart of the election manipulations, looking at a web of more than 500 companies. As part of that, in this piece on Medium, they spotlighted 10 of the biggest enablers, including Google partner Acxiom and one of Comcast’s many divisions.
Altogether, Tow’s investigation looked at “the deep layer of ad tech, content customization and targeting technologies, and A/B testing platforms that this ‘fake news’ behavioral tracking infrastructure is meant to ‘deliver on.’”
Importantly, of course, Facebook’s ad-targeting tools are at the center of the Internet’s business operations, and thus at the center of this web of manipulation. Facebook tools were a crucial fulcrum in election meddling. What’s unclear from Facebook is how the company’s executives view their responsibilities in elections going forward.
And on yet another front this past week, Facebook COO Sheryl Sandberg acknowledged in a post that the company let buyers target ads to people who self-identified with topics such as “Jew hater,” “How to burn jews,” or “History of ‘why jews ruin the world.’”
“Seeing those words made me disgusted and disappointed – disgusted by these sentiments and disappointed that our systems allowed this,” Sandberg wrote. “Hate has no place on Facebook – and as a Jew, as a mother, and as a human being, I know the damage that can come from hate.”
The acknowledgement came after a ProPublica investigation revealed that the ad categories existed at all — something Sandberg conceded the company itself hadn’t realized.
That Facebook’s top two executives are both Jewish makes this black eye a particular “fail,” as Sandberg termed it. Her post came at the start of the Jewish High Holy Days. Happy New Year, everyone!
To the company’s modest credit, Facebook moved quickly on this one, eliminating the racist categories and freezing 5,000 others pending a review.
Sandberg promised a string of improvements in oversight and enforcement, with oversight coming from both more humans and more aggressively deployed artificial intelligence.
More AI may indeed help, but clearly the company’s seemingly untended algorithms are a big source of the problems in which it now finds itself repeatedly mired.
If the humans can’t be bothered to do a better job safeguarding their company, billions of users and the body politic as a whole from the unintended consequences of misuse of their power, perhaps it’s time for new humans at the top. Put that in your Newsfeed.