Mark Zuckerberg and his company have been plagued over the past few years by a series of scandals, prompting critics to urge users to "delete Facebook." Now, Facebook has an answer: It will delete itself.
The social network is not going anywhere, of course. But it will become, if a 3,200-word announcement Zuckerberg released Wednesday is to be believed, much less of one. The site will allegedly transform from an open conversation platform into a "privacy-focused communications platform" revolving around end-to-end encryption and commercial transactions. Facebook's focus since its founding has been to connect the world. This week, the company decided it would rather connect each of us to a small group of our friends. The change matters.
Years ago, when Facebook was barely out of the dorm room, most Americans believed world-broadening innovations such as social media were inherently good. But openness ushered in ugliness along with the democratization that platforms' founders had promised. Misinformation ran rampant. Consumers paid for these public and open services with their data, in troves. The solution until now for chief executives still committed to early-days idealism was to build structures shoring up their services against myriad threats. Zuckerberg is presenting an alternative: give up and try something new.
Exactly how dramatic Facebook's shift will be remains to be seen. The idea is to start with end-to-end encrypted messaging by merging communications across Facebook, WhatsApp and Instagram, and then to build on top of that whatever services the company can include. What that means for the news feed, and on what timeline, is unclear.
Facebook's changes may be motivated in part by a desire to avoid European regulators' attempts at breaking apart its properties. They may be motivated, too, by declining use, especially among young people, and a need to search for alternative revenue streams.
But the platform is also moving away from what makes it Facebook because being Facebook has become too difficult. Privacy advocates demand that Facebook stay away from their information, while critics want the platform to crack down on those who have turned it dangerous. Playing lawmaker, cop and court all at once comes at the cost of political controversy and even human harm. Content moderators, The Verge's Casey Newton reported recently, are being diagnosed with post-traumatic stress disorder from sifting through beheadings, suicide footage and child pornography.
Facebook is not giving up on its business, but it is giving up on its vision. Whether focusing more on private conversations and less on public displays will be better for society is hard to say. Misinformation already thrives on WhatsApp, in part because it is impossible for a platform to police content it cannot see. Now, if things get bad, Facebook won't be able to see it either. The ugliness won't go away, yet Facebook's eyes will be closed.
— The Washington Post