TikTok removed more than 104.54 million videos in the first half of 2020 for violating its guidelines and terms of service. This number represents less than 1% of the total videos uploaded to the Chinese platform. Most of the removed videos came from users in India and the USA.
According to TikTok's recent report, about 96.4% of inappropriate videos were identified and removed before users reported them, while most of those (90.3%) were removed before they received even a single view. The largest share, 30.9%, was removed for containing sexual content. Other videos were removed because they endangered users or promoted illegal activities.
TikTok also cooperated with governments and law enforcement authorities that made formal, valid requests for user information. Such requests must be submitted with appropriate legal documents, such as subpoenas, court orders, and warrants. India submitted the most requests, and TikTok complied with 79% of them. Other countries with many such requests were the USA, Israel, and Germany. TikTok responded only when all the conditions were met.
The company said it would disclose user information without the necessary legal procedures only in emergency situations, such as preventing the imminent death or serious bodily injury of a person.
Notably, China was missing from the list of governments that made such requests.
In addition, TikTok said it has received legal requests from governments, law enforcement agencies, and copyright (IP) holders to remove specific content that had been uploaded to its platform. These requests can be met only if they are made through "appropriate channels" or if required by law.
Most requests for this purpose came from Russia and India.
All platforms must work together
TikTok reported that its interim head, Vanessa Pappas, sent a letter to the heads of nine social networking platforms proposing a collaboration: the companies would notify one another about violent and otherwise inappropriate content found on their platforms.
"Social networking and content platforms constantly face problems with the posting and republishing of inappropriate content, and this affects all of us, including our users, our teams, and the wider community," the company said. "… content is transferred from one application to another. Technology can help with the automatic detection and restriction of much of this content, and moderators and teams are often at the forefront of addressing these issues."
"Each platform's individual effort to protect its users will become more effective through a formal, collaborative approach to early recognition and notification between companies," said TikTok. The company suggested that such collaboration would significantly reduce the chances of this content appearing at all.