Trolling is a disruptive online behavior aimed at eliciting strong emotional reactions from individuals or communities. When targeted at an individual, it resembles harassment and is often treated as a personal problem that is easy to identify and ignore, hence the common internet advice: "Don't feed the trolls."
However, trolling isn't always aimed at specific victims. Another prevalent form is "outrage baiting", where content is designed to trigger public anger for the sole purpose of driving traffic. As an example, consider "Karen" videos, which depict an entitled and belligerent antagonist being awful to an innocent worker. The genre has been a consistent feature of the viral internet across a variety of platforms, including YouTube, Reddit, and now TikTok. While the form likely arose and gained popularity organically, it now seems that a cottage industry of attention seekers has been staging these videos as outrage bait. This dynamic is even worse on platforms like X (formerly Twitter) that reward attention not just with the on-platform spoils of followers and likes, but with financial incentives.
Identifying trolling content is complex; the content alone is rarely sufficient evidence. One must evaluate an account's patterns of behavior, its network associations, and the likelihood that the content could exist in a genuine context. This takes time and expertise that most users cannot routinely spare.
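The kind of multi-signal assessment described above can be sketched as a simple scoring heuristic. This is purely illustrative: the signal names, thresholds, and weights below are assumptions invented for the example, not a real moderation system.

```python
# Hypothetical sketch: scoring an account on behavioral signals
# (account history, content patterns, network associations) rather
# than judging a single piece of content. All weights are assumptions.

from dataclasses import dataclass

@dataclass
class AccountSignals:
    account_age_days: int        # very new accounts carry less trust
    outrage_post_ratio: float    # fraction of posts styled as outrage bait (0-1)
    repeat_staging_flags: int    # times content resembled known staged formats
    network_overlap: float       # overlap with clusters of known bait accounts (0-1)

def troll_risk_score(s: AccountSignals) -> float:
    """Combine behavioral signals into a rough 0-1 risk score."""
    score = 0.0
    if s.account_age_days < 90:
        score += 0.2
    score += 0.3 * s.outrage_post_ratio
    score += 0.1 * min(s.repeat_staging_flags, 3)
    score += 0.2 * s.network_overlap
    return min(score, 1.0)

# A two-week-old account posting mostly outrage content scores high.
suspect = AccountSignals(account_age_days=14, outrage_post_ratio=0.8,
                         repeat_staging_flags=2, network_overlap=0.5)
print(round(troll_risk_score(suspect), 2))  # prints 0.74
```

The point of the sketch is that no single field is decisive; it is the combination of history, content pattern, and network that pushes the score up, which is exactly why the assessment is impractical for an ordinary user scrolling a feed.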
Reducing the frequency of trolling cannot be accomplished through content moderation alone: one cannot drain the ocean with a straw. Algorithms designed to maximize user engagement promote trolling content precisely because it elicits an emotional response. This feedback loop is not merely at the heart of the proliferation of trolling content; it is its source. Trolls produce this content because the algorithm rewards it. Until platforms reconsider their fundamental business models and reward systems, trolling is a feature, not a bug.
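The feedback loop described above can be made concrete with a toy model: an engagement-maximizing ranker rewards each content type in proportion to the engagement it generates, and creators reallocate production toward whatever was rewarded. The engagement numbers and initial mix are illustrative assumptions, not platform data.

```python
# Toy model of the engagement feedback loop. Outrage content starts as
# a small minority but generates (by assumption) 3x the engagement of
# neutral content, so the reward-and-reallocate cycle steadily takes over.

engagement = {"neutral": 1.0, "outrage": 3.0}  # assumed engagement per post
share = {"neutral": 0.9, "outrage": 0.1}       # initial production mix

for step in range(10):
    # Each content type's reward is its share weighted by its engagement...
    reward = {k: share[k] * engagement[k] for k in share}
    total = sum(reward.values())
    # ...and creators shift production toward what the algorithm rewarded.
    share = {k: reward[k] / total for k in share}

print({k: round(v, 2) for k, v in share.items()})
# prints {'neutral': 0.0, 'outrage': 1.0}
```

Even with moderation removing some fraction of outrage posts each round, the dynamic only slows unless the reward itself changes, which is the sense in which trolling is a feature of the system rather than a bug in it.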