Algorithms Don’t Polarize People, People Do

Facebook claims that individual choice limits exposure to cross-cutting content more than algorithms.

“Individual choice has a larger role in limiting exposure to ideologically cross cutting content [than the News Feed algorithm],” a recent study by Facebook’s own data team concluded. “We show that the composition of our social networks is the most important factor limiting the mix of content encountered in social media.”

[Chart: political polarization increasing since the 1970s]

In other words, the thing most polarizing people online is people themselves — a phenomenon that the latest string of anti-Trump apps, browser extensions and add-ons would not appear to help. On top of the unfriending site, there’s an iPhone app called Trump Trump that will eliminate the candidate’s name from the websites you’re browsing, as if he didn’t exist. Remove Donald Trump from Facebook will, as its name suggests, scrub the candidate from your News Feed. A mountain of Chrome extensions will replace Trump’s name or picture with a series of other things: “Voldemort,” “your drunk uncle at Thanksgiving” — even the smiling poop emoji. (Source)

Nailing down the ethical responsibilities of algorithms is part of Algorithmic Accountability.
Degree Assortativity characterizes most human networks and makes them resistant to the inflow of outside ideas.
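Degree assortativity measures whether nodes tend to link to others with a similar number of connections; a positive coefficient means like connects with like, which is one way a network can resist outside ideas. A minimal sketch of the two extremes, assuming the `networkx` library (the graphs here are toy illustrations, not data from the study):

```python
import networkx as nx

# Assortative extreme: two disjoint cliques of different sizes.
# Every edge joins two nodes of identical degree, so r = +1.
cliques = nx.disjoint_union(nx.complete_graph(5), nx.complete_graph(10))
r_assortative = nx.degree_assortativity_coefficient(cliques)

# Disassortative extreme: a star. Every edge joins the high-degree
# hub to a degree-1 leaf, so r = -1.
star = nx.star_graph(8)
r_disassortative = nx.degree_assortativity_coefficient(star)

print(f"disjoint cliques: r = {r_assortative:+.2f}")
print(f"star graph:       r = {r_disassortative:+.2f}")
```

Real social networks sit between these extremes, but typically on the assortative side: heavily connected people link to other heavily connected people, reinforcing the filtering effect the study describes.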
