Some viewers following live coverage of the Notre-Dame Cathedral fire on YouTube were met with a strangely out-of-place info box offering facts about the September 11 attacks.
BuzzFeed first reported the appearance of the misplaced fact-check box on at least three live streams from major news outlets. Twitter users also took note of the information mismatch.
Ironically, the feature is a tool designed to fact-check topics that generate misinformation on the platform. It adds a small info box below videos with third-party factual information from YouTube partners — in this case, Encyclopedia Britannica.
YouTube began rolling out the fact-checking “information panels” this year in India, and they now appear to be available in other countries.
“Users may see information from third parties, including Encyclopedia Britannica and Wikipedia, alongside videos on a small number of well-established historical and scientific topics that have often been subject to misinformation online, like the moon landing,” the company wrote in its announcement at the time.
The information boxes are algorithmically generated, and today’s unfortunate slip-up makes clear that the tool doesn’t have much human oversight. It’s possible that imagery of a burning tower-like structure triggered the algorithm to surface the 9/11 information, but we’ve asked YouTube for more details on what specifically went wrong here.
Update: A YouTube spokesperson provided TechCrunch with the following statement:
We are deeply saddened by the ongoing fire at the Notre Dame Cathedral. Last year, we launched information panels with links to third party sources like Encyclopedia Britannica and Wikipedia for subjects subject to misinformation. These panels are triggered algorithmically and our systems sometimes make the wrong call. We are disabling these panels for live streams related to the fire.