How ambitious influencers are forced to fight the algorithm


There are two ways to try to understand the impact of content moderation and the algorithms that enforce it: by relying on what the platforms say, and by asking creators themselves. In Tyler’s case, TikTok apologized and blamed an automatic filter that was set up to flag words associated with hate speech but apparently couldn’t understand context.

Brooke Erin Duffy, an associate professor at Cornell University, teamed up with graduate student Colten Meisner to interview 30 creators on TikTok, Instagram, Twitch, YouTube and Twitter around the time Tyler’s video went viral. They wanted to know how creators, especially those from marginalized groups, navigate the algorithms and moderation practices of the platforms they use.

What they found: Creators put a lot of effort into understanding the algorithms that shape their experiences and relationships on these platforms. Since many creators use multiple platforms, they have to learn the hidden rules for each one. Some creators are adapting their entire approach to content production and promotion in response to the algorithmic and moderation biases they encounter.

Below is our conversation with Duffy about her upcoming research (edited and condensed for clarity).

Creators have long discussed how algorithms and moderation affect their visibility on the platforms that made them famous. So what surprised you most in these interviews?

We had a sense that creators’ experiences were shaped by their understanding of the algorithm, but after the interviews, we really began to see how deeply [this impact] pervades their daily lives and work… the amount of time, energy, and attention they devote to learning about these algorithms and investing in them. They have a kind of critical awareness that these algorithms are understood to be uneven. Despite this, they still invest all that energy in the hope of making sense of them. It really draws attention to the lopsided nature of the creator economy.

How often do creators think about the possibility of being censored, or of their content not reaching an audience, because of algorithmic suppression or moderation?

I think it fundamentally structures their content creation process as well as their content promotion process. These algorithms change at whim; there is no visibility into how or why. In many cases, there is no direct communication with the platform. And that completely, fundamentally affects not only your experience but also your income.


