(Aug. 20, 2019) On July 3, 2019, the German Federal Office of Justice announced in a press release that it had issued a €2 million fine (about US$2.2 million) against Facebook Ireland Limited for violating Germany’s Network Enforcement Act (NetzDG). The fine was imposed because Facebook had failed to sufficiently fulfill its reporting duty under the Act. The Federal Office of Justice specified that Facebook’s first transparency report, published in July 2018, included incomplete information on the number of complaints received about unlawful content published on the platform. This is the first fine issued against a social network since the Network Enforcement Act entered fully into force at the beginning of 2018. Facebook announced its plan to appeal the Federal Office of Justice’s decision and said it welcomed the additional clarity on the regulations that this process would provide.
The Network Enforcement Act passed the German Bundestag (parliament) two years ago. The goal of the Act is to compel social media platforms to take a more forceful approach to combating hate speech and “fake news.” Posts containing “fake news” or hate speech that are “manifestly unlawful” must be deleted within 24 hours of being reported by users (NetzDG art. 1 § 3 ¶ 2 no. 2). If a post is not illegal on its face, the social networks have a maximum of seven days to investigate and delete the content (art. 1 § 3 ¶ 2 no. 3).
The Act also obligates social media platforms that receive more than 100 complaints about illegal content in a calendar year to publish a German-language report every six months detailing how content is moderated and complaints are handled on their networks (art. 1 § 2). The report must be published in the Federal Gazette and on the homepage of the social media network one month after the end of each half-year period (art. 1 § 2 ¶ 1).
The Act outlines sanctions for social media networks that intentionally or negligently violate these obligations (art. 1 § 4). The Federal Office of Justice’s guidelines on fines for violating the provisions of the Network Enforcement Act stipulate that legal persons may be fined up to €50 million (about US$55 million) for systematically failing to delete illegal content or issuing incomplete reports.
Facebook’s Reporting Found to Lack Transparency
After the deadline for publishing the first transparency report under the Network Enforcement Act passed in the summer of 2018, German media outlets reported that Facebook had presented a much lower number of complaints than other social networks. Facebook’s report listed 886 complaints about illegal content concerning 1,704 posts, of which 362 were deleted. This number was significantly lower than the numbers reported by Google and Twitter. Google received 215,000 complaints about posts on its video platform, YouTube, and deleted 58,000 posts, while Twitter reported 265,000 complaints and the removal of 29,000 posts.
This divergence in the number of complaints can be explained by the platforms’ different configurations of their reporting mechanisms. Twitter and Google integrated the NetzDG complaint form into their already existing reporting mechanisms, which allows German users to file NetzDG complaints directly from the piece of content in question. Facebook, on the other hand, separated the NetzDG form from its standard flagging mechanism for posts that violate the community standards. The separate reporting form for illegal posts under the NetzDG is not directly accessible from a post but must be located via Facebook’s Help Center.
The Federal Office of Justice found that “the NetzDG reporting form is too hidden” and that users who wish to submit a complaint about criminal content find themselves steered towards the standard channels, since the parallel existence of two complaint mechanisms is not made sufficiently transparent. Due to this lack of transparency, most complaints are made through the standard feedback process, including complaints about content that is unlawful under the Network Enforcement Act. Therefore, in the opinion of the Federal Office of Justice, Facebook’s report published in July 2018 is incomplete, as it lists only a fraction of the complaints filed by users about unlawful content. The Federal Office of Justice held that, when social networks offer more than one reporting channel, the different channels must be made clear and transparent to users, and the complaints received through all these channels must be included in the transparency report. The Office of Justice summarized the problem with reporting only complaints made through one of the feedback mechanisms: the public cannot judge how effectively the complaint mechanism functions.
Effectiveness of the Network Enforcement Act
The Network Enforcement Act has been controversial from the start, with some critics finding the law to be ineffective and others concerned about overblocking of content by platforms afraid of fines. In the first year that the Act was in force, the Federal Office of Justice received 704 notifications from internet users reporting that a social media platform had failed to delete illegal content within 24 hours. The Office of Justice had expected 25,000 notifications and 500 fine proceedings per year. One of the reasons for this low number might be the limited applicability of the regulations. One of the criteria for applying the Act is that the media platforms have more than two million German users (NetzDG art. 1 § 1 ¶ 1). Several platforms that are known for content that “incites hatred” (Volksverhetzung) remain outside the scope of the Act because they do not fulfill the Act’s criterion of having more than two million German users or are classified under the Act as messenger apps instead of social media platforms as defined by the Act. Some platforms, such as Instagram, do not have to publish a report of their own because they received fewer than 100 complaints about illegal content, which is the threshold for the reporting obligation under the Act. Further, the Federal Office of Justice does not have information on the number of social media platforms the Act applies to, because the Act does not provide for the Office of Justice to actively examine which platforms would fall under its provisions. The Act also provides that it be reviewed by the German parliament in 2020.
Prepared by Anne Catherine-Stolz, Law Library intern, under the supervision of Jenny Gesley, Foreign Law Specialist.