On October 4-5, Facebook's websites and mobile applications were unavailable for more than six hours. Coming amid reports of Instagram's negative impact on teenagers, accusations that the company helps spread fakes (including about vaccines), and long-standing antitrust claims against Zuckerberg's company, the outage of its top three services could prove costly for the firm.
At the very least, it could reinvigorate the long-running antitrust investigations into Facebook and expedite concrete decisions aimed at curbing its power.
"All is lost"
On the evening of October 4, users around the world began to complain about problems accessing Facebook and Instagram, as well as about the company's mobile applications. The outage was visible on DownDetector, a service that monitors the availability of online platforms: Facebook, Instagram, and WhatsApp were all down.
The crash was called the largest in the last 13 years: it lasted more than six hours and, unlike the website shutdown in 2008, affected billions of users across all three platforms.
Many of them began actively using other social platforms, in particular Telegram, Twitter, and YouTube, which themselves stumbled from time to time under the load.
Telegram, for example, confirmed difficulties caused by the influx of new users.
Facebook attributed the access difficulties to routing errors (a faulty configuration change withdrew the network routes leading to its servers), although theories of a global attack on the company circulated for a while. Put simply, the rest of the Internet could no longer find the sites Facebook owns.
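From the outside, the failure looked like an ordinary name-resolution error: every lookup of a Facebook-owned domain simply came back empty. A minimal Python sketch of such a check (the hostnames here are illustrative, and this is not how Facebook's engineers diagnosed the outage):

```python
import socket

def can_resolve(hostname: str) -> bool:
    """Return True if the local resolver can map hostname to an IP address."""
    try:
        socket.gethostbyname(hostname)
        return True
    except socket.gaierror:
        # Raised when resolution fails, which is what users saw
        # worldwide for facebook.com during the outage.
        return False

# "localhost" resolves via the local hosts file, so this works even offline.
print(can_resolve("localhost"))
```

During the outage a call like `can_resolve("facebook.com")` returned False everywhere, because the routes to Facebook's own DNS servers had been withdrawn.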
The problem was compounded by the fact that the outage also took down the company's internal tools, which could have been used to fix it faster. Security specialists, for instance, could not get into the server rooms because their badges did not work. Internal communication systems, including the corporate chat employees use to discuss urgent issues, were down as well, so specialists had to coordinate by email.
Commenting on the outage, Facebook said that user data was most likely not affected. However, almost simultaneously, information appeared online about the sale of personal data belonging to more than 1.5 billion users of Facebook's services.
The dataset may be the result of semi-legitimate scraping; similar data was published this spring. Still, the near-simultaneous outage of the company's services and publication of its users' data casts doubt on Facebook's image as a reliable company that cares about its users.
A series of revelations about Facebook: VIP users, Instagram and kids, and Project Amplify
September was a "black" month for the company: a series of articles about Facebook's problems appeared in The Wall Street Journal and The New York Times.
It all started with leaked internal Facebook documents describing an elite category of users who were allowed to violate the company's rules.
Their use of the social network is governed by a program called XCheck. Such users are allowed to publish things for which ordinary people would be blocked on the platform. The basic rule for handling these de facto Facebook celebrities is to avoid the PR scandals that could arise if their content were removed or their posting rights restricted.
The total number of such users exceeded 5 million. Donald Trump was one of them, which may explain his free rein on Facebook and the company's lack of reaction to published content that Twitter, for example, removed.
The second block of revelations concerned Instagram's impact on teenagers' mental health. The Wall Street Journal reported that the social network, filled with idealized, filtered photos, affects its young audience badly: Instagram worsens their self-image and body image, fuels anxiety and depression, and makes them think about suicide more often.
The journalists noted that the company knows about these problems but does nothing to solve them, because the social network's existing mechanics make it good money. Even the move to hide like counts on Instagram had little effect.
After journalists compared Facebook and Instagram's approach to content with tobacco manufacturers' approach to the harms of smoking, one Facebook executive countered by likening Instagram to cars, which remain useful even though they cause road accidents.
The company also responded to these claims by accusing journalists of distorting facts and interpreting the information one-sidedly, and it published its own report on Instagram's impact on users' mental health. The one reaction that could be called adequate was the announcement that Instagram Kids, a social network for children under 13, is being put on hold for now.
At the same time, the US Congress began investigating how Instagram works and how teenagers use it.
Another Facebook story was published by journalists at The New York Times. It describes the ramping up of an initiative called Project Amplify, which promotes positive news about the company and its leadership.
In other words, Facebook's response to criticism was to advertise its own activities on its own advertising platform. As its authors conceived it, positive news about Facebook should drown out the increasingly vocal claims against the social network and its projects.
A little-studied manipulation tool
While the accusations about VIP users and the articles about Instagram's impact on teens are relatively new, claims that Facebook is practically the main tool for manipulating public opinion and a source of fakes have been around for years.
A recent report on Facebook's most viewed content confirms that these problems persist and have grown even more serious. The document describes the massive spread of propaganda via Facebook's recommendation system: for example, 75% of manipulative propaganda content was seen not by subscribers of the pages that published it but by random users.
Many researchers have long been trying to understand how the social network's algorithms work, but the company increasingly blocks research into them and sometimes supplies scientists with incorrect data. In some cases researchers have even been blocked on the platform or threatened.
In a recent interview, Facebook VP Nick Clegg admitted that the company is not opposed to providing data for research but does not yet understand how to do so without harming itself.
Is it a profit-generating machine?
The articles about Instagram and teenagers, along with other facts about how Facebook operates, were based on documents leaked by Frances Haugen, a former product manager at the company who had access to classified documents and decided to make them public.
On the evening of October 5 (US time), Haugen testified before Congress about these documents. Her main point echoed the journalists' earlier conclusions: Facebook understood very well how its platforms' algorithms work and what threats they pose to society, but did nothing about it, because the company's top priority was and remains generating profit.
And the company did well on that front, if we recall that the algorithmic news feed promoted the most talked-about content, often with clickbait headlines, something politicians exploited heavily. Haugen suggested improving Instagram by, for example, raising the age limit to 18 and abandoning the algorithmic feed.
In response, Mark Zuckerberg denied all the allegations on his Facebook account, stating that the company genuinely cares about its users' safety and that the purpose of the algorithmic feed is to let people spend more time online with their loved ones.
However, for all Facebook's skill at making money, its underlying business model has been suffering lately, ever since Apple launched its updated privacy tools and restricted advertisers' access to user data.
That is why all of Facebook's recent problems, from the outage to the revelations by journalists and employees, may have far-reaching consequences.
What the "Black September" means for Facebook
The exposure of harmful algorithms, data leaks, outages, ineffective advertising on the platform, and claims about manipulative content: all these problems of the largest social platform will have consequences. It is hard to predict, though, whether those consequences will benefit end users or amount to redecorating after a fire.
The failure of Facebook's services has become perfect proof of the problems that arise when too much power is concentrated in the hands of one tech giant. All of this could serve as another argument in the ongoing antitrust cases against the company.
The access problems also hand ammunition to proponents of a decentralized Internet, or at least to those who argue that Facebook (like other monopolists) should be broken up into separate independent entities.
Analyst Casey Newton suggests other, less drastic steps the company itself and the US government could take to fix the situation and restore the social network's reputation and credibility. But the main conclusions, in his view, should apply to the entire technology market.
Platforms should take the events of the past few weeks as a cue to give scientists the means to conduct research on subjects in the public interest... Congress should pass a law requiring large platforms to provide this data, especially if we don't want to rely on whistleblowers and the randomness of leaked documents to understand the impact of social media on society. What Congress should not do, however, is pass a sweeping law intended to solve all problems at once, because that would almost certainly severely restrict freedom of speech.
Casey Newton, author of the technology newsletter Platformer
Facebook's long-term prospects as a social platform that people trust with their data look very vague, especially amid growing fatigue with social services and the emergence of alternative niche projects. So it is quite possible that the Frances Haugen story and the collapse of Facebook's services will dent the popularity of its platforms, becoming, if not the beginning of the end for Facebook as a social-media giant, then at least a bulldozer that halts its audience growth.