Ann Arbor (Informed Comment) – Frances Haugen’s whistleblower testimony on Tuesday before the Senate Commerce, Science and Transportation Subcommittee on Consumer Protection, Product Safety and Data Security focused on Facebook’s effects on children, but the testimony was also a devastating indictment of the pernicious effects of the company’s algorithms on human societies across the planet. Haugen is extremely brave and a true American hero, having brought to light thousands of pages of internal memos that show the company’s bad faith.
This damaging activity is part of what brings the company $40 billion a year in profits. Haugen maintains that Facebook would still be very profitable if it dialed back the negativity, just not quite as profitable.
Haugen is a specialist in the algorithmic products that underlie search and recommendation systems. Such algorithms are what make the Facebook feed go. Haugen worked at other platforms, including Google+, and she concludes that of them all, Facebook is the most evil: “the choices being made inside of Facebook are disastrous for our children, for our public safety, for our privacy, and for our democracy.”
Haugen hangs a lot on the algorithm, and not everyone may have a firm idea of what that is. The word comes from the last name of the Baghdad-based Iranian Muslim mathematician Muhammad ibn Musa al-Khwarizmi (c. 780-850 CE), whose book on algebra was extremely influential. In medieval Europe, “al-Khwarizmi” was corrupted into “algorithm.”
An algorithm is just a set procedure for performing a calculation. We don’t think about it, but we deploy an algorithm every time we do long division.
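To make the idea concrete, here is a minimal sketch in Python of the long-division procedure we all learned on paper. The function name and the worked example are invented purely for illustration:

```python
# A minimal sketch of the paper-and-pencil long-division algorithm:
# bring down one digit at a time, record a quotient digit, carry the remainder.
def long_division(dividend: int, divisor: int) -> tuple[int, int]:
    quotient, remainder = 0, 0
    for digit in str(dividend):                          # work left to right
        remainder = remainder * 10 + int(digit)          # "bring down" the next digit
        quotient = quotient * 10 + remainder // divisor  # record the quotient digit
        remainder = remainder % divisor                  # carry the remainder forward
    return quotient, remainder

print(long_division(1234, 7))  # (176, 2) -- the same steps you would do on paper
```

The point is not the code but the idea: a fixed sequence of steps that anyone, or any machine, can follow mechanically.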
You could write code that says: every time a customer searches for conflict, show them ads for guns; when they click on guns, show them ads for bullets; and when they click on bullets, show them ads for funeral homes and pictures of shooting victims. At each stage, this algorithm escalates toward violence.
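A toy version of that rule chain fits in a few lines of Python. To be clear, this is invented for illustration and is not Facebook’s code:

```python
# A hypothetical escalation chain, as described above: each click is answered
# with content one step closer to violence. (Illustrative only.)
ESCALATION_RULES = {
    "conflict": "ads for guns",
    "guns": "ads for bullets",
    "bullets": "ads for funeral homes and pictures of shooting victims",
}

def next_content(last_click: str) -> str:
    # Anything outside the chain falls back to neutral content.
    return ESCALATION_RULES.get(last_click, "neutral content")

for click in ["conflict", "guns", "bullets"]:
    print(click, "->", next_content(click))
```

The disturbing part is how little machinery escalation requires: three rules and a lookup.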
It gets worse, because another thing that keeps people on the site is “Meaningful Social Interactions” (MSI). What Facebook wants is to foster more MSIs downstream, i.e., as time goes on. An MSI can be innocent: reaching out to an old high school friend counts. But the problem is that a knock-down, drag-out shouting match between two users counts as MSI too.
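A hypothetical sketch of why that is a problem: a metric that simply counts interactions, as MSI is described as doing, cannot tell a warm reunion from a flame war. The class and scoring function here are invented for illustration:

```python
# A toy MSI tally: every comment, reply, and share counts the same,
# whatever its emotional tone. (Not Facebook's code; illustrative only.)
from dataclasses import dataclass

@dataclass
class Interaction:
    kind: str        # "comment", "reply", "share", ...
    sentiment: str   # "warm" or "hostile" -- invisible to the metric

def msi_score(interactions: list[Interaction]) -> int:
    return len(interactions)  # counts interactions, not their quality

reunion = [Interaction("comment", "warm"), Interaction("reply", "warm")]
shouting_match = [Interaction("comment", "hostile"), Interaction("reply", "hostile")]

print(msi_score(reunion), msi_score(shouting_match))  # 2 2 -- indistinguishable
```

By this yardstick, rage is worth exactly as much as friendship.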
Haugen is saying that this is the kind of algorithm Facebook uses. The company makes money by showing you ads, and it can make more money if it can keep you on the site. Human beings’ most powerful feelings come from fear and anger, the fight-or-flight mechanisms that depend on pumping adrenaline into the bloodstream; they are what keep you most interested in a scene. Facebook’s procedures, or algorithms, serve up an escalating series of images, posts and ads that lead you toward these negative emotions, or they encourage negative interactions with other users that spill a lot of vitriol, making you angry or afraid, but definitely engaged.
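To see how that escalation can emerge without anyone ordering it, consider a toy ranker (again, a sketch with invented numbers, not Facebook’s system) that sorts a feed purely on predicted engagement:

```python
# A toy engagement-based ranker. Nobody wrote "promote anger", but because
# fear and anger reliably predict engagement, inflammatory items float to the top.
posts = [
    {"text": "photos from a friend's wedding",      "predicted_engagement": 0.3},
    {"text": "heated political flame war",          "predicted_engagement": 0.8},
    {"text": "terrifying (and false) health rumor", "predicted_engagement": 0.9},
]

feed = sorted(posts, key=lambda p: p["predicted_engagement"], reverse=True)
for post in feed:
    print(post["text"])  # rumor first, flame war second, wedding last
```

The objective function, not any individual engineer, decides what you see first.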
Haugen noted, “Engagement-based ranking and these processes of amplification, they impact all users of Facebook. The algorithms are very smart in the sense that they latch on to things that people want to continue to engage with. And unfortunately, in the case of teen girls and things like self-harm, they develop these feedback cycles where children are using Instagram to self-soothe, but then are exposed to more and more content that makes them hate themselves.”
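That feedback cycle is simple to model. In this minimal sketch, with invented topics and weights, a single early click tips the scale, and the loop then compounds it round after round:

```python
# A toy feedback loop: engaging with a topic raises its weight,
# so the system recommends more of it, which invites more engagement.
weights = {"sports": 1.0, "music": 1.0, "self-harm": 1.1}  # one extra click tipped the scale

def recommend() -> str:
    return max(weights, key=weights.get)  # serve the highest-weighted topic

for step in range(5):
    topic = recommend()
    weights[topic] *= 1.5                 # engagement feeds back into the ranking
    print(step, topic, round(weights[topic], 2))
# Prints "self-harm" every round, with a weight that compounds each time.
```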
Facebook is a machine for ramping up adrenaline, and it does this daily in billions of human beings, making them more and more frightened and more and more angry. Haugen is saying that Facebook’s software engineers didn’t necessarily set out to produce these results, but they know the site has this effect; since it produces advertising dollars, they refuse to dial it back:
- “I don’t think Facebook ever set out to intentionally promote divisive, extreme, polarizing content. I do think though that they are aware of the side effects of the choices they have made around amplification, and they know that algorithmic-based rankings, so engagement-based ranking, keeps you on their sites longer. You have long — you have longer sessions, you show up more often, and that makes them more money.”
One of the scarier implications of Haugen’s testimony is that Facebook administrators are not entirely in control of the algorithms. They can try to dial things back, but the algorithms have subroutines and they can continue to push your buttons hard even if the company puts in some dampers. The artificial intelligence knows what sets you off, and it serves more and more of that to you. When I said this on Twitter, one of my readers suggested that we are already in a Skynet scenario, from the James Cameron Terminator movies starring Arnold Schwarzenegger, where an artificial neural net develops consciousness and comes after human beings.
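A toy model shows why a damper may not be enough. Suppose, hypothetically, that management multiplies the score of inflammatory posts by 0.8; if the learned engagement signal is strong enough, they still win the ranking:

```python
# A sketch of a damper losing to the learned signal. (Invented numbers.)
posts = [
    {"text": "cat photos",   "predicted_engagement": 0.4, "inflammatory": False},
    {"text": "outrage bait", "predicted_engagement": 0.9, "inflammatory": True},
]

DAMPER = 0.8  # a hypothetical company-imposed penalty on inflammatory content

def damped_score(post: dict) -> float:
    score = post["predicted_engagement"]
    return score * DAMPER if post["inflammatory"] else score

feed = sorted(posts, key=damped_score, reverse=True)
print([p["text"] for p in feed])  # ['outrage bait', 'cat photos'] -- still on top
```

Unless the damper outweighs the engagement gap the model has learned, the buttons keep getting pushed.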
She said,
- “During my time at Facebook, first working as the lead product manager for civic misinformation, and later on counter-espionage, I saw Facebook repeatedly encounter conflicts between its own profits and our safety. Facebook consistently resolves these conflicts in favor of its own profits.
The result has been more division, more harm, more lies, more threats, and more combat. In some cases, this dangerous online talk has led to actual violence that harms and even kills people. This is not simply a matter of certain social media users being angry or unstable or about one side being radicalized against the other.
It is about Facebook choosing to grow at all costs, becoming an almost trillion-dollar company by buying its profits with our safety. During my time at Facebook, I came to realize the devastating truth. Almost no one outside of Facebook knows what happens inside of Facebook. The company intentionally hides vital information from the public, from the US government, and from governments around the world.”
Haugen refers to the role Facebook has had in fanning ethnic violence:
- “What we saw in Myanmar and are now seeing in Ethiopia are only the opening chapters of a story so terrifying, no one wants to read the end of it. Congress can change the rules that Facebook plays by and stop the many harms it is now causing.”
The Muslim Rohingya minority has been ethnically cleansed, and some would say subjected to genocide, in Buddhist-majority Burma or Myanmar, with some 700,000 people driven out. Facebook was a primary means by which the perpetrators whipped up hate against the Rohingya, all of whom were blamed for the actions of a handful of radicals. It would be like all white people being blamed for the Capitol insurrection.
Facebook keeps apologizing when called on the carpet: oops, sorry about that genocide we helped foment. Haugen is saying that these apologies are entirely insincere. Facebook makes money by keeping people glued to its site, and it accomplishes that by making them adrenaline junkies, whimpering with terror and bursting with rage. This, Haugen argues, is the business model, and Facebook is not going to give it up unless someone makes it.
—–
Bonus video:
Facebook Whistleblower Frances Haugen testifies before Senate Commerce Committee