Facebook Knows It "Our algorithms exploit the human brain's attraction to divisiveness,"

Jeff Horwitz and Deepa Seetharaman, reporting for The Wall Street Journal:


A Facebook team had a blunt message for senior executives. The company’s algorithms weren’t bringing people together. They were driving people apart. “Our algorithms exploit the human brain’s attraction to divisiveness,” read a slide from a 2018 presentation. “If left unchecked,” it warned, Facebook would feed users “more and more divisive content in an effort to gain user attention & increase time on the platform.” That presentation went to the heart of a question dogging Facebook almost since its founding: Does its platform aggravate polarization and tribal behavior? The answer it found, in some cases, was yes.

Facebook had kicked off an internal effort to understand how its platform shaped user behavior and how the company might address potential harms. Chief Executive Mark Zuckerberg had in public and private expressed concern about “sensationalism and polarization.” But in the end, Facebook’s interest was fleeting. Mr. Zuckerberg and other senior executives largely shelved the basic research, according to previously unreported internal documents and people familiar with the effort, and weakened or blocked efforts to apply its conclusions to Facebook products. Facebook policy chief Joel Kaplan, who played a central role in vetting proposed changes, argued at the time that efforts to make conversations on the platform more civil were “paternalistic,” said people familiar with his comments.
