A documentary posted on Netflix last month has elicited a lengthy comment and a little angry-face emoji from Facebook.
The social media giant has posted a lengthy rebuttal to director Jeff Orlowski’s 93-minute documentary, which includes interviews with former high-level product engineers at Facebook, Google, Twitter and other top tech companies.
The common theme: These engineers originally created these platforms believing they were building tools that drew people closer together. But they came to realize that through the monetization of these platforms, they were tapping into the profoundly powerful science of brain chemistry and motivation, manipulating consumers in ways that have been polarizing and corrosive for global society.
Netflix hasn't disclosed any kind of viewership metrics for the documentary, which debuted Sept. 9. But it's potentially damning stuff. And it seems to have hit home at the world’s largest social media company.
“Rather than offer a nuanced look at technology, it gives a distorted view of how social media platforms work to create a convenient scapegoat for what are difficult and complex societal problems,” Facebook said in its post.
Facebook pushes back in seven areas. One is the idea that it functions as a kind of modern Philip Morris, creating a product it knows is addictive. The social media company claims it is not engaged in a manic, pathological quest to drive up usage and maximize advertising dollars.
“Our News Feed product teams are not incentivized to build features that increase time-spent on our products. Instead we want to make sure we offer value to people, not just drive usage,” Facebook said. “For example, in 2018 we changed our ranking for News Feed to prioritize meaningful social interactions and de-prioritize things like viral videos. The change led to a decrease of 50M hours a day worth of time spent on Facebook.”
Facebook defended its use of algorithms, while calling hypocrisy on Netflix.
“Facebook uses algorithms to improve the experience for people using our apps—just like any dating app, Amazon, Uber, and countless other consumer-facing apps that people interact with every day,” Facebook said. “That also includes Netflix, which uses an algorithm to determine who it thinks should watch The Social Dilemma film, and then recommends it to them. This happens with every piece of content that appears on the service.”
Regarding criticisms about data usage and privacy, as well as how it has influenced elections, Facebook said it has changed.
“We’ve acknowledged that we made mistakes in 2016,” the company said. “Yet the film leaves out what we have done since 2016 to build strong defenses to stop people from using Facebook to interfere in elections. We've improved our security and now have some of the most sophisticated teams and systems in the world to prevent attacks. We've removed more than 100 networks worldwide engaging in coordinated inauthentic behavior over the past couple of years, including ahead of other major global elections since 2016.”
As for the notion that its social network has put us all in a bubble, unable to recognize others’ world views, well, it was like that when we got here, Facebook said.
“The truth is that polarization and populism have existed long before Facebook and other online platforms were created, and we consciously take steps within the product to manage and minimize the spread of this kind of content,” the post claims.
Daniel Frankel is the managing editor of Next TV, an internet publishing vertical focused on the business of video streaming. A Los Angeles-based writer and editor who has covered the media and technology industries for more than two decades, Daniel has worked on staff for publications including E! Online, Electronic Media, Mediaweek, Variety, paidContent and GigaOm.