This is a catch-all approach that stems from a simple principle: one user generating a ton more data than other users tends to be a signal of abuse. What that abuse is varies widely, but the trend is broadly applicable: abuse most often happens in the tails of the distribution of activity.
In each of these cases, the volume can't (by itself) tell the platform what the user is doing, but it is a technical signal that can both (a) serve as an automatic trigger for human review and (b) indicate that a small number of users are consuming a disproportionate share of the resources/space on a platform.
Setting limits on content creation, even very high ones, can help reduce the likelihood of abuses like these occurring, by capping how useful the platform can be for whatever motivates the actor behind each.
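As a rough sketch of what this can look like in practice (the thresholds, names, and structure here are illustrative assumptions, not any particular platform's implementation), a simple per-user counter can queue a human review at one volume threshold and refuse further creations at a much higher hard cap:

```python
from collections import defaultdict

# Hypothetical per-day thresholds; real values depend on the platform.
DAILY_CREATE_LIMIT = 5_000   # hard cap on items a single user may create per day
REVIEW_THRESHOLD = 1_000     # volume at which a human review is queued

creations_today = defaultdict(int)
review_queue = []

def record_creation(user_id: str) -> bool:
    """Count one item created by user_id; return False once the hard cap is exceeded."""
    creations_today[user_id] += 1
    count = creations_today[user_id]

    # Volume alone doesn't say what the user is doing, so outliers are
    # flagged for human review rather than actioned automatically.
    if count == REVIEW_THRESHOLD:
        review_queue.append(user_id)

    # The hard cap bounds how useful the platform can be to an abusive actor.
    return count <= DAILY_CREATE_LIMIT
```

Even with generous numbers, the two thresholds separate the two roles the signal plays: the lower one routes the tail of the distribution to a person, and the upper one limits the damage while that review happens.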