In Germany, the Network Enforcement Act (NetzDG) came into force in October 2017. It is primarily intended to combat hate speech and fake news on social networking platforms and covers a total of 21 criminal offences. Under the Act, social media platforms are required to delete obviously illegal content within 24 hours of receiving a complaint. Content that is not obviously illegal must be deleted within seven days, or the case can be referred to a recognised self-regulatory body.
Since March 2020, platforms have been able to transfer cases that are not obviously illegal and are difficult to assess legally to the FSM (Freiwillige Selbstkontrolle Multimedia-Diensteanbieter), where the content is examined by an external, independent panel of experts. This NetzDG review committee comprises around 50 lawyers who decide on the cases independently of both the platforms and the FSM. The platforms are bound by the committee's decisions and must ensure that content deemed illegal is no longer accessible in Germany.
The FSM's many years of experience as a self-regulatory body in the field of online youth protection now contribute to its work under the NetzDG. The independent experts' decisions can provide future guidance for dealing with hate speech on the borderline of legality. The decisions are published in anonymised form on the FSM website.
More information can be found on the FSM website.
This article was initially published on the INHOPE website and is reproduced here with permission.
Find out more information about the work of the German Safer Internet Centre (SIC) generally, including its awareness raising, helpline, hotline and youth participation services, or find similar information for Safer Internet Centres throughout Europe.