Abstract: TikTok will likely follow the same trajectory as YouTube, in the sense that its global growth will mean a larger need to effectively moderate content.
This is an excerpt from a story delivered exclusively to Business Insider Intelligence Digital Media Briefing subscribers.
As TikTok's global user base approaches, and in some cases eclipses, that of other major platforms, its problems are beginning to mirror theirs: TikTok is reportedly fueling hate speech and mob violence in India, in some cases with fatal consequences, per Wired. Over the last five months, TikTok has removed 48,000 videos promoting hate speech and violence, according to new court documents obtained by Wired.
TikTok will likely follow the same trajectory as rival open platforms like YouTube, in the sense that its explosive global growth in users will also mean a greater need to effectively moderate content. That's a central dynamic to open platforms: The more they scale, the more their content balloons, and the more difficult it becomes to efficiently and effectively police that content.
And on TikTok, as with YouTube, the algorithmic levers that promote content across the app are the same ones that allow harmful content to have far-reaching, damaging impact. Like YouTube's recommendation algorithm, TikTok's elevates the videos that drive the most engagement, measured in time spent and, likely, in whether and how often a user returns to rewatch. While this has propelled some users to fame, it has also amplified content that can radicalize viewers, at times with devastating real-world consequences.
But by virtue of its Chinese ownership under ByteDance, we know relatively little about the app's content moderation practices compared with other platforms.
TikTok has content policies against hateful or violent content and a moderation team that flags and removes problematic videos, but the extent of those efforts is unclear. From what we do know, they seem thin: TikTok parent ByteDance reportedly employs just 250 content moderators in India, where the app has an estimated 200 million users. That's one moderator for every 800,000 Indian users.
For comparison, we know Facebook has about 20,000 human content moderators for its 2.38 billion global users — one for every 119,000 global users — and it still struggles to rein in content abuses.
TikTok has three major constituencies to keep in mind while moderating content:
TikTok needs to ensure that users, influencers, and creators feel safe creating, sharing, and engaging with content on its platform.
TikTok needs to maintain trust with advertisers, businesses, and publishers who are increasingly looking to the app to reach young people.
Because the app needs to secure music licensing deals to supply its platform with songs, record labels, talent agencies, and recording artists are also critical. The platform's integrity will likely figure into deal renewal talks, and its respective deals with the three major labels are set to expire soon.
As it grows further, the red-hot platform will need to prioritize transparency to assure these key constituencies that it's taking proper action to maintain a safe environment, and to avoid the pitfalls of its predecessors. Despite their efforts, larger platforms have been unable to eradicate harmful content.
As a balm for that failure, they've begun offering stakeholders greater transparency, a particularly crucial step amid heightened public scrutiny. Transparency reports, now provided by Facebook, Google, Twitter, and even Snap, are essentially progress reports: they let platforms share metrics on how they're preventing and removing harmful content, but also let them display the sheer scale of the problem they're up against.
Amid extreme volatility and unpredictable impacts in certain markets, the best these platforms can do — barring stricter limits — is prove that they're improving. TikTok's ascent will come with a growing requirement to prove the same, and it shouldn't wait for additional scandals to begin doing so.