Texas Tech University researcher Ashley Landrum attended the last two flat-earth conferences and studied how the doctrine spreads on the Internet. It turned out that 99% of the participants who had joined the movement in the past few years learned about it from YouTube. The remainder heard about it from people who had watched the viral videos and become interested themselves.
In her further research, Landrum found that she could not locate rebuttals to the flat-earthers' arguments on YouTube. More precisely, when she relied only on the platform's basic search tools, without drawing on personal expertise, she failed. Only when she knew exactly what to look for did she finally reach content that reasonably criticized flat-earth beliefs. And when she then wanted to see the counter-arguments to that criticism, she ran into the same problem: YouTube did not surface any.
Comparing her data with other studies, Landrum concluded that the shape of the Earth, flat or round, had nothing to do with it. The cause is the platform's recommendation algorithms, which readily cater to the interests of fans of all kinds of conspiracy theories: they draw users into a stream of links to content matching the topic of their query while filtering out criticism. In effect, YouTube creates conditions in which searching for any topic confines the results within that topic.
It turns out that accurate information and disinformation now spread online in parallel, as it were, without ever intersecting. A user who has begun to take an interest in a subject has very little chance of encountering an alternative point of view, even though such information is more than plentiful. At least this is the case on YouTube, so it should come as no surprise that the number of flat-earth followers is growing: the topic is trending, and the platform's algorithms sustain it.