
Posted on Jun 13, 2019 in Book Reviews, Manipulation

“Zucked,” by Roger McNamee


Although somewhat repetitive, and with Roger tooting his own horn, this is a fascinating story of the rise of Facebook and the unintended consequences of its business model. As Roger points out, that model offers a free service supported by ads rather than subscriptions (a model other internet companies use as well). The ads are lucrative because Facebook tracks everything you do and figures out which ads will appeal to you, providing customized targeting not possible in broadcast TV. The Facebook News Feed launched in 2006. In 2012, advertisements were added to it, mixed in with posts from friends and impossible to ignore. The presentation is addictive: teaser, eye-catching headlines; “like” buttons that make you want to reciprocate with your friends; notifications; and many other features. Like buttons have been installed on nearly every website outside Facebook, so Facebook can track your activity off Facebook as well, further refining its ad targeting. In 2008, Facebook introduced Facebook Connect, which gives you the convenience of logging into other sites through Facebook without having to remember a bunch of different passwords; Facebook uses it to track you on those sites, gathering still more data for ad targeting. The artificial intelligence behind the ads learns over time what you like and provides more of the same. The system is designed to be addictive, since the more time you spend on Facebook, the more of their ads you see. It often isn’t clear what is an ad, because anyone can pay to “boost” any post, and boosted posts aren’t obviously ads unless you notice the “sponsored” tag.

Roger mentions a number of unintended consequences. The artificial intelligence that serves up more of the ads and posts it thinks you’ll like can’t tell the difference between true and false. It turns out that the most successful ads, and boosted posts, are the inflammatory ones, which often aren’t true; the algorithm thus unknowingly promotes more of them because they are financially successful. Since people are fed more of what they like, “filter bubbles” form: liberals see only liberal content and conservatives see only conservative content. The result is that we are divided and inflamed. Roger points out that a third of the US population identifies with ideas that are demonstrably untrue, and a far larger number have no regular interaction with people who disagree with them or have a radically different life experience. Groups take advantage of this for political purposes, including the Russians, who used Facebook both in the 2016 election and to sow general divisiveness and weaken the U.S. from within, for example by organizing pro- and anti-Muslim demonstrations at the same time and place. Facebook’s response to the Russian manipulation included prioritizing posts from friends over journalists, but since inflammatory and divisive content usually arrives via friends, Roger suggests this can make the problem worse. The unintended amplification of hate speech “effectively abridged the rights of the peaceful to benefit the angry.”

The book describes how Facebook has also maintained a virtual monopoly by acquiring upstart competitors such as Instagram and WhatsApp. With a monopoly, Facebook could then reduce the natural spread of posts to force people to pay: “Every year or so, Facebook would adjust the algorithm to reduce organic reach.” Thus, a post that once would have drawn plenty of clicks was demoted in the news feed in favor of paid posts. Facebook has escaped antitrust scrutiny because current antitrust policy, put in place during the Reagan administration, holds that a monopoly is not a problem unless it results in higher prices to consumers; other damage to consumers, to other companies, and to the country isn’t enough.

Roger proposes a number of solutions. He supports antitrust enforcement, in particular forcing Facebook to divest Instagram and WhatsApp. He would like to see a button that lets users toggle between filtered content tailored to what they like and unfiltered content that would expose them to other opinions. He would also like to see internet companies in general embrace “human-driven design,” which prioritizes users’ well-being instead of treating users as products for advertisers. These all seem like good ideas; read the book for more.
